Understanding MWI: A Newbie's Guide to Quantum Physics and the Multiverse

  • Thread starter: confusedashell
  • Tags: MWI
  • #101
Hans de Vries said:
In David Deutsch's picture the atom is a core with a classical particle rotating around it, taking one path in one world and a different path in another world. This is not correct.
Attributing this view to Deutsch seems incorrect too. Do you have a quote where he actually stated this?

Hans de Vries said:
With a single human body containing ~10^30 elementary particles with all of them splitting up into endless numbers of paths at the femtometer/attosecond scale, the number of different worlds add up very fast, for a single human, let alone for an entire world.
Since in MWI worlds are just an emergent feature of the wavefunction, arguing on the basis of their number does not seem quite significant.
 
  • #102
JesseM said:
Plenty of quantum phenomena can't be explained with classical optics, like entanglement or quantum computing.

The quantum computing videos from David Deutsch I've seen could be explained by classical optics. You might see people agreeing here, including those who consider MWI to have attractive sides (vanesch).

JesseM said:
Are you familiar with the phenomenon of decoherence in ordinary QM? If you have a small subsystem A interacting thermally with another system B, the interactions can make it so that the reduced state of the subsystem A becomes arbitrarily close to a "mixed state" in which there is virtually no interference between the different elements of the superposition for A (see my post #15 on this thread for more). As I understand it, this is basically how the many-worlds interpretation explains why macroscopically different "worlds" don't noticeably interfere with one another.

It's hard to see how there can't be interference; non-interference occurs only for orthogonal states, 90 degrees for bosons and 180 degrees for fermions. This means there are at most two different states per particle which do not interfere.

For the rest, everything is based on interference. From propagator theory: a wavefunction moves in a certain direction because all other directions are interfered out destructively; that is, there's no motion without interference.
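A rough numerical sketch of this propagator picture (assuming units with hbar = m = 1 and a Gaussian spread of momenta; the parameters are purely illustrative, not anything stated in the thread): a free wave packet built by superposing plane waves has its peak move at the group velocity because, away from the stationary-phase region, the contributions from different momenta cancel destructively.

```python
# Sketch: a free-particle wave packet as a superposition of plane waves
# exp(i(kx - k^2 t/2)) in units hbar = m = 1 (illustrative assumption).
# The peak of |psi|^2 moves at the group velocity k0 because, away from the
# stationary-phase point, contributions from different k interfere destructively.
import numpy as np

k0, sigma_k = 5.0, 0.5                          # mean momentum, momentum spread
k = np.linspace(k0 - 5*sigma_k, k0 + 5*sigma_k, 2001)
A = np.exp(-(k - k0)**2 / (2*sigma_k**2))       # Gaussian momentum amplitudes

x = np.linspace(-5.0, 25.0, 1201)

def packet(t):
    phase = np.exp(1j * (np.outer(x, k) - 0.5 * k**2 * t))
    return phase @ A                            # sum over all plane-wave components

for t in (0.0, 1.0, 2.0, 3.0):
    prob = np.abs(packet(t))**2
    print(f"t = {t}: |psi|^2 peaks near x = {x[np.argmax(prob)]:.2f}"
          f" (k0 * t = {k0 * t:.2f})")
```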

JesseM said:
Why "classical particle"? The paths in the path-integral picture don't behave like the paths of classical particles. Presumably you could in principle make correct predictions about the probability an electron will be detected at different positions around the nucleus by summing all possible paths, I think Deutsch is just adding the idea that each of these paths is a real electron, a philosophical gloss which shouldn't change the physical analysis.

The propagators of massive particles reflect continuously (the interacting left and right chiral components), but this is not the picture Deutsch gives with "In our world the particle goes through one slit and in another world it goes through another slit." This is a classical picture.

JesseM said:
Yes, I get that. But again, why is this anything more than an argument from incredulity, i.e. "I find it incredible there could be so many versions of me"? If the universe is spatially infinite there would also be an infinite number of slightly different versions of you at sufficiently great spatial distances, is this a good scientific or philosophical argument for believing space must be finite?

What bothers me is that the different worlds do not interact as predicted by quantum mechanics, but rather they all exist mostly independently without disturbing each other. Especially since older ideas which claimed that particles can only interfere with themselves, and not with other particles of the same kind, have been proven wrong by experiment.

Regards, Hans
 
  • #103
Hans de Vries said:
The quantum computing videos from David Deutsch I've seen could be explained by classical optics. You might see people agreeing here, including those who consider MWI to have attractive sides (vanesch).
Do you remember a post by vanesch where he discusses this? In any case, surely you're not arguing that all quantum phenomena can be explained by classical optics (violations of the Bell inequality obviously can't, for example), so if you agree the quantum formalism is needed for certain situations, then whatever your interpretation of the quantum formalism, wouldn't you apply the same interpretation to any situation which physicists analyze using QM, like the double-slit experiment?
Hans de Vries said:
It's hard to see how there can't be interference
From what I remember, the interference terms in the "reduced state" for a certain subsystem (which are apparently the off-diagonal terms in the density matrix) never actually disappear completely, but they do decay exponentially. For example, this paper says:
The quantum decoherence process is elegantly expressed in the framework of the reduced density matrix of the quantum register. When no coupling to the environment is present, the reduced density matrix simply follows a Heisenberg-type evolution. As soon as the coupling to the environment is introduced, the off-diagonal terms of the reduced density matrix of the register decay with respect to time. This is often referred to as phase damping. In the simplest case of a single two level system connected to an environment, the off-diagonal elements of the reduced density matrix decay exponentially in time as ~e^−q(t) , where t is the time and the function q(t) depends on the strength of the coupling to the environment.
I have never studied decoherence formally, so I don't claim to understand why this is true or even precisely what it means. I'd suggest you might at least want to do some of your own reading on the subject instead of dismissing it based on my layman's summary; as far as I know, decoherence is a widely-accepted consequence of applying the rules of QM to the problem of a quantum system which is in thermal interaction with a larger environment.
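To make that summary slightly more concrete, here is a minimal sketch (a toy phase-damping channel of the kind used in textbook discussions of decoherence; it is not the specific environment model of the paper quoted above, and the coupling strength is an arbitrary assumption): the off-diagonal "interference" element of a qubit's reduced density matrix decays exponentially while the populations stay fixed.

```python
# Sketch: phase damping on one qubit (toy model; the per-step coupling p is an
# arbitrary illustrative value).  Each step the environment "learns" the basis
# state with probability p, which multiplies the off-diagonal terms of the
# reduced density matrix by (1 - 2p) and leaves the diagonal untouched,
# so the coherence decays like (1 - 2p)^n, i.e. exponentially in time.
import numpy as np

plus = np.array([1.0, 1.0]) / np.sqrt(2)        # |+> = (|0> + |1>)/sqrt(2)
rho = np.outer(plus, plus)                      # pure superposition, rho_01 = 0.5
Z = np.diag([1.0, -1.0])
p = 0.05                                        # coupling to environment per step

for step in range(101):
    if step % 20 == 0:
        print(f"step {step:3d}: |rho_01| = {abs(rho[0, 1]):.5f},"
              f" populations = {rho[0, 0]:.2f}, {rho[1, 1]:.2f}")
    rho = (1 - p) * rho + p * (Z @ rho @ Z)     # phase-damping channel
```

The coherences shrink exponentially but never reach exactly zero, which is consistent with the remark below that the different "worlds" interact only weakly rather than not at all.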
Hans de Vries said:
The propagators of massive particles reflect continuously (the interacting left and right chiral components), but this is not the picture Deutsch gives with "In our world the particle goes through one slit and in another world it goes through another slit." This is a classical picture.
I don't quite understand your objection here; are you just objecting that he makes it sound like there are only two paths involved? If so, he'd probably say he was simplifying for a general audience; in fact you have to integrate over an infinite number of distinct paths through each slit.
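A toy sum-over-paths for the two slits (a sketch with made-up geometry; a real path-integral calculation involves many more paths and proper weights) shows how the screen pattern comes from adding the phases of many distinct paths through each slit, rather than from following one classical trajectory:

```python
# Sketch: two-segment paths source -> point inside a slit -> screen point,
# summed with phase exp(i k (r1 + r2)).  Geometry and wavelength are arbitrary
# illustrative choices; obliquity and 1/r factors are ignored.
import numpy as np

wavelength = 1.0
k = 2 * np.pi / wavelength
src_y, src_z = 0.0, -50.0                 # source position
slit_z, screen_z = 0.0, 50.0              # slit plane and screen plane
slit_centers, slit_width = (+5.0, -5.0), 1.0

def amplitude(y_screen, paths_per_slit=200):
    total = 0.0 + 0.0j
    for yc in slit_centers:
        for y_slit in np.linspace(yc - slit_width/2, yc + slit_width/2,
                                  paths_per_slit):
            r1 = np.hypot(y_slit - src_y, slit_z - src_z)
            r2 = np.hypot(y_screen - y_slit, screen_z - slit_z)
            total += np.exp(1j * k * (r1 + r2))   # phase of this particular path
    return total

for y in np.linspace(-10.0, 10.0, 9):
    print(f"y = {y:6.1f}:  relative intensity ~ {abs(amplitude(y))**2:12.1f}")
# Maxima (near y = 0, +/-5, +/-10) and deep minima (near y = +/-2.5, +/-7.5)
# come entirely from how the phases of the many paths add up.
```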
Hans de Vries said:
What bothers me is that the different worlds do not interact as predicted by quantum mechanics, but rather they all exist mostly independently without disturbing each other.
Again, I think you really need to delve into the theory of decoherence to understand why many-worlds advocates say the different worlds interact only weakly (they don't say that they aren't interacting at all, I've seen a quote by Deutsch where he points out that the interference terms never disappear completely even with decoherence).
 
  • #104
Hans de Vries said:
What bothers me is that the different worlds do not interact as predicted by quantum mechanics, but rather they all exist mostly independently without disturbing each other.

The classical picture is an approximation; Deutsch considers worlds to be unsharp and to affect each other. To discuss his views, it is perhaps more appropriate to refer to his actual papers, e.g. "The Structure of the Multiverse", arXiv:quant-ph/0104033:

"if reality – which in this context is called the multiverse – is indeed literally quantum-mechanical, then it must have a great deal more structure than merely a collection of entities each resembling the universe of classical physics.[...] "

"Since a generic quantum computational network does not perform anything like a classical computation on a substantial proportion of its qubits for many computational steps, it may seem that when we extend the above conclusions to the multiverse at large, we should expect parallelism (ensemble-like systems) to be confined to spatially and temporally small, scattered pockets. The reason why these systems in fact extend over the whole of spacetime with the exception of some small regions (such as the interiors of atoms and quantum computers), and why they approximately obey classical laws of physics, is studied in the theory of decoherence (see Zurek 1981, Hartle 1991)."

"For present purposes, note only that although most of the descriptors of physical systems throughout spacetime do not obey anything like classical physics, the ones that do, form a system that, to a good approximation, is not only causally autonomous but can store information for extended periods and carry it over great distances. It is therefore that system which is most easily accessible to our senses – indeed, it includes all the information processing performed by our sense organs and brains. It has the approximate structure of a classical ensemble comprising ‘the universe’ that we subjectively perceive and participate in, and other ‘parallel’ universes."
 
  • #105
Hans de Vries said:
The quantum computing videos from David Deutsch I've seen could be explained by classical optics. You might see people agreeing here, including those who consider MWI to have attractive sides (vanesch).

JesseM said:
Do you remember a post by vanesch where he discusses this?

https://www.physicsforums.com/showthread.php?p=1070970#post1070970


Regards, Hans
 
  • #106
Hans de Vries said:
https://www.physicsforums.com/showthread.php?p=1070970#post1070970
OK, I thought you were saying that some significant aspects of quantum computation itself could be explained through classical optics; this is just a discussion of a "curious feature about a beam splitter". Anyway, see my comment above--if one agrees with Deutsch that at least some quantum phenomena, like the fast factorization of large numbers using Shor's algorithm, would be most naturally understood in terms of the many-worlds interpretation (Deutsch sometimes talks about quantum computers achieving their rapid speeds by running huge numbers of computations in parallel, in different 'worlds'), then it would be strange not to extend this to all phenomena that physicists analyze using QM, even if some of these phenomena can also be analyzed using classical optics. As an analogy, if you believe that spacetime curves in the neighborhood of a black hole, you wouldn't say that other phenomena involving gravitation don't involve curved spacetime just because some of them can also be analyzed perfectly well using Newtonian gravity.
 
  • #107
JesseM said:
OK, I thought you were saying that some significant aspects of quantum computation itself could be explained through classical optics,

No, I wouldn't say so, although Shor's algorithm has also been implemented with classical optics (http://www.sciencenews.org/articles/20010519/fob4.asp). What we are discussing here is MWI and the claim of other universes "hiding on hyperplanes via some yet-to-be-discovered theory of quantum gravity", according to Deutsch.

JesseM said:
this is just a discussion of a "curious feature about a beam splitter".

Well... It's actually what Deutsch calls quantum computing in his online course, although it is best understood using classical optics. Nevertheless he makes statements like "The computing is done in another universe". Subsequently you get people claiming this as "the proof of MWI".

http://www.quiprocone.org/Protected/Lecture_2.htm

Regards, Hans
 
  • #108
Hans de Vries said:
No, I wouldn't say so, although Shor's algorithm has also been implemented with classical optics (http://www.sciencenews.org/articles/20010519/fob4.asp).
The article seems to be talking about a different algorithm rather than Shor's, but that's a minor quibble obviously; it's certainly interesting that any of the sort of speedups associated with quantum computers might be achievable using classical optics. But the article doesn't go so far as to say all quantum computations could be achieved with classical optics; instead it says "some other theorists had previously argued that a computer using classical physics can perform as well as any quantum computer in some calculations that involve only interference."
Hans de Vries said:
What we are discussing here is MWI and the claim of other universes "hiding on hyperplanes via some yet-to-be-discovered theory of quantum gravity", according to Deutsch.
Where did he make this claim? It sounds like he is speculating about future theories here rather than discussing the issue of interpreting our existing theory of QM, which is all that the MWI purports to do. It is of course possible that QM will turn out to be just a sort of approximation to some ultimate theory of quantum gravity or "theory of everything", and that untestable elements of existing interpretations (like the multiple 'worlds' of the MWI, or the FTL pilot wave of Bohmian mechanics, or the backwards causality of the transactional interpretation) will correspond to actual testable elements of the new theory.
Hans de Vries said:
Nevertheless he makes statements like "The computing is done in another universe". Subsequently you get people claiming this as "the proof of MWI".
Well, anyone who thinks that any experiment can "prove" an interpretation is obviously confused or at least speaking sloppily--the most you can really argue is that certain physical results are more elegantly explained using one interpretation over another.
 
  • #109
reilly said:
Why doesn't an MWI approach go back to the origins of probability theory, particularly conditional probability -- as in chains of events--, circa the 17th century? (I'll bet it actually does, but was jettisoned, so to speak, for whatever reasons, one of which I would guess was cumbersomeness. )

Regards,
Reilly Atkinson

Going back to the origin and axioms of probability theory is the way to go IMO. Glad to hear more people share this view. This is also where most of my serious philosophical objections are rooted (the efficiency of applying the axioms of probability to reality).

The concept of objective probabilities makes no sense except in special cases. Also, the concept of unitarity is highly suspect, as it is relevant only in the context of closed systems; but the whole point is: how do we deduce that we have a closed system? There is bound to be an uncertainty, and the limiting case where this uncertainty is insignificant certainly limits the domain of applicability. I personally think that to solve these things, a reconstruction of the formalism should be made starting from the probability concepts and first principles.

/Fredrik
 
  • #110
Count Iblis said:
Forget this "splitting", "number of universes" etc. You just have the postulates of QM without wavefunction collapse. How can an observer collapse the state of the entire universe by just observing? :smile:

IMO, the only thing that collapses is the projection that lives inside the observer, which I see as no weirder than a Bayesian update.

Why my view of the world changes when I receive more information about it is quite obvious. Whatever the world REALLY is, it still has to be projected onto my perspective.

I think these words of Niels Bohr's still stand:

"It is wrong to think that the task of physics is to find out how nature is. Physics concerns what we can say about nature...".

/Fredrik
 
  • #111
JesseM said:
Where did he make this claim? It sounds like he is speculating about future theories here rather than discussing the issue of interpreting our existing theory of QM, which is all that the MWI purports to do. It is of course possible that QM will turn out to be just a sort of approximation to some ultimate theory of quantum gravity or "theory of everything", and that untestable elements of existing interpretations (like the multiple 'worlds' of the MWI, or the FTL pilot wave of Bohmian mechanics, or the backwards causality of the transactional interpretation) will correspond to actual testable elements of the new theory.

Here:

http://uk.arxiv.org/ftp/quant-ph/papers/0104/0104033.pdf

David Deutsch said:
This is reminiscent of the infinity of ways in which one can slice (‘foliate’) a spacetime into spacelike hypersurfaces in the general theory of relativity. Given such a foliation, the theory partitions physical quantities into those ‘within’ each of the hypersurfaces and those that relate hypersurfaces to each other.
...
...
Hence the theory presented here and the classical theory of foliation must in reality be two limiting cases of a single, yet-to-be-discovered theory – the theory of the structure of the multiverse under quantum gravity.


Regards, Hans
 
  • #112
I didn't follow this thread so forgive me for making fragmentary comments. (I do not adhere to MWI or any other particular camp)

Hans de Vries said:
Amusing when two solipsists aid each other in a discussion. At some point in time one would expect the two to start fighting bitterly about who is the real "source of the universe" and who is a product of imagination.

I think the conclusion they should arrive at is that there exist different views. I see no logical conflict in this. And those views that are favoured by the environment are those that will persist?

Who is real and who is a projection is mutual. I think of the identity of the observer as being the projection of the environment. Due to encoding constraints, not all information about the environment can possibly be projected onto a small observer; only a limited projection is sustained, constituting the observer. But I guess that is also the key to explaining the non-trivial dynamics we witness.

The solipsists may finally reach the agreement that they consistently disagree about certain things. But two solipsists in the same environment will generally agree on the major part making up classical reality, mediated by the environment.

While the objectivists will keep chasing their own tails, failing to see that it is impossible to find an objective view that projects perfectly identically onto two different observers :wink:

I don't see solipsism in physics as having anything to do with fantasies or imagination in the negative sense. I merely see it as saying that the state of the observer encodes the projection of the environment. Like Zurek said, "What the observer knows is inseparable from what the observer is".

Meaning that "what views are valid" in the solipsist view really means: which observers will be selected/favoured in this environment? While it's easy to imagine that ANY observer has a chance, there is certainly going to be a selection that favours particular observers/particles/structures. And thus these "solipsist" views will come to dominate the environment, giving the appearance of agreement and objectivity. An unfit view or particle will quickly destabilise in this environment, so these contradictory views will not be a problem since they will be unlikely/rare.

/Fredrik
 
  • #113
Hans de Vries said:
Well... It's actually what Deutsch calls quantum computing in his online course, although it is best understood using classical optics.
How could one-photon realizations be best understood using classical optics, and how could classical optics help explain aspects of a quantum theory?
 
  • #114
I think you guys are forgetting where MWI comes from: it doesn't come from "observation" or anything. It is just a straightforward application of the axioms of quantum theory to the physical system that is "the observer", assuming that this observer is just as well part of "the physics" as anything else. So it is the application of quantum theory to "big systems". Whether that is allowed or not is left open. It might be that quantum theory doesn't apply, the way we know it, to these big systems. A good reason might be that gravity plays a role in these bigger systems. But given that we don't have anything else yet that replaces quantum theory, we apply it, knowing very well that we apply quantum theory outside of its "proved scope of applicability".

So what are these famous axioms ? It is 1) the principle of superposition, which says that if |A> is a state that the system can be in, and |B> is a state that the system can be in, then any linear combination a |A> + b |B> is also a state that the system can be in. And it is 2) the fact that the time evolution of the system is given by a unitary operator U(t,t').

Well, if you apply that bluntly to "Joe saw the red light go on" and "Joe saw the green light go on", and you realize that "the red light go on" was: the particle hit detector 1, and "the green light go on" was "the particle hit detector 2", then it is obvious that you arrive very quickly at situations where Joe's situation is described as:
a |Joe saw the red light go on> + b |Joe saw the green light go on>

And it is difficult to interpret this. We know that it will end up in one way or another that Joe has |a|^2 chance to see a red light, and |b|^2 chance to see a green light. We know that Joe won't see both.
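A toy version of this step (a hypothetical three-level "Joe" memory and a hand-built interaction unitary, purely for illustration; none of the labels or numbers come from the thread) makes it explicit how the superposition gets copied into the observer's state:

```python
# Sketch: a two-outcome "particle" and a three-level "Joe" memory
# (|ready>, |saw red>, |saw green>); all labels and the interaction below are
# hypothetical illustrations.  A unitary correlates Joe with the outcome,
# giving  a |detector 1>|Joe red>  +  b |detector 2>|Joe green>.
import numpy as np

a, b = 0.6, 0.8                               # any amplitudes with |a|^2 + |b|^2 = 1
particle = np.array([a, b])                   # a|detector 1> + b|detector 2>
ready = np.array([1.0, 0.0, 0.0])             # Joe's memory starts in |ready>

psi_before = np.kron(particle, ready)         # joint state before Joe looks

# Interaction as a permutation (hence unitary) of the joint basis (particle, Joe):
# |1, ready> <-> |1, red>   and   |2, ready> <-> |2, green>.
U = np.eye(6)
U[[0, 1]] = U[[1, 0]]
U[[3, 5]] = U[[5, 3]]

psi_after = U @ psi_before
print(np.round(psi_after, 3))
# components: (1,ready) (1,red) (1,green) (2,ready) (2,red) (2,green)
# -> a sits on |1, red> and b on |2, green>: Joe is entangled with the outcome,
#    and |a|^2, |b|^2 are the weights of the two macroscopically distinct branches.
```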

This is what *elementary quantum theory tells us*. And MWI stops there, while other interpretations go on fiddling, because they don't like what they see.

MWI says that the ACTUAL QUANTUM STATE of Joe is now the above superposition, but that "a Joe consciousness" will only be aware of one of the Joe states, which means that there are now "two Joes" around, and if you happen to be a Joe, you have |a|^2 chance to be the first one, and |b|^2 chance to be the second one.

Projection interpretations say that "upon observation" we have to re-interpret the superposition a |Joe saw the red light go on> + b |Joe saw the green light go on>
as just a statistical mixture of possibilities, of which only one "really happened" of course.
However, this last statement has a difficulty: why should the "Joe" superposition be interpreted as a statistical mixture, while (|s> + |p>) of an electron, not ?
Because we know, in quantum theory, that there is an observable difference between a statistical mixture of 50% s and 50% p, and the superposition |s> + |p>.
So SOME superpositions are "genuine" superpositions, and others are "just statistical mixtures", but quantum theory doesn't tell us which one is which ? It depends on the declaration of some system to be an "observer" or not ? Is an electron an observer ?
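The observable difference alluded to here is easy to exhibit numerically (a sketch using a generic two-level system rather than literal s and p orbitals; sigma_x stands in for any interference-sensitive observable, which is an assumption of convenience):

```python
# Sketch: 50/50 statistical mixture of |0> and |1>  versus  the superposition
# (|0> + |1>)/sqrt(2).  Both give identical statistics in the {|0>, |1>} basis,
# but an observable with off-diagonal elements (sigma_x here) tells them apart.
import numpy as np

psi = np.array([1.0, 1.0]) / np.sqrt(2)
rho_superposition = np.outer(psi, psi)                    # pure state |+><+|
rho_mixture = np.diag([0.5, 0.5])                         # classical 50/50 mixture

sigma_x = np.array([[0.0, 1.0], [1.0, 0.0]])              # "interference" observable

for name, rho in (("superposition", rho_superposition), ("mixture", rho_mixture)):
    p0, p1 = np.diag(rho)
    x_avg = np.trace(rho @ sigma_x)
    print(f"{name:13s}: P(0) = {p0:.2f}, P(1) = {p1:.2f}, <sigma_x> = {x_avg:+.2f}")
# superposition: P(0) = 0.50, P(1) = 0.50, <sigma_x> = +1.00
# mixture      : P(0) = 0.50, P(1) = 0.50, <sigma_x> = +0.00
```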

Still other interpretations change entirely the formalism of quantum mechanics (such as Bohmian mechanics), and give then a more "Newtonian" mechanical interpretation to the added parts to the formalism. The difficulty here is that these extra formal elements have been introduced just for the sake of giving a mechanistic interpretation without having any dynamical significance, and moreover destroying certain symmetries in the laws of nature.

This is why, personally, I prefer MWI *as an interpretation of quantum theory*, in that it tries to follow as faithfully as possible the fundamental axioms of quantum theory. But I realize that this means that we apply those axioms at scales far beyond its proven domain of applicability! Only, we don't have anything else. And moreover, out of this view doesn't come anything that is in blunt contradiction with observation. So is MWI "true" ? No idea ! I don't know if quantum theory applies to human bodies for instance. MWI is probably the view on QM which is most consistent with its formalism and at least, it doesn't lead to any contradiction. But it is of course far from "proven" - in fact, there's no way to prove it, beyond proving quantum theory correct on "larger and larger scales".

That's why I find the claims by Deutsch a bit disturbing: for the moment we haven't gotten any proof that quantum theory is applicable at the human scale or beyond. We even have a serious difficulty: gravity. So we haven't established the applicability of quantum theory at a scale which is assumed in MWI.

I would even say: imagine that we find that QM is limited in scope, and that we have to replace it with something else on a larger scale. That won't mean that we will not be studying QM anymore as an effective theory, just as we still use Newtonian mechanics. I would say that even then, MWI would be a good "view" on that approximate QM. We would treat it with a smile probably, because we might, for instance, know that due to lack of unitarity on a larger scale, the worlds "collapse" or whatever. But I think it would still be the best view on linear quantum theory. Simply because it sticks to its postulates all the way. So maybe MWI is only "valid" for a few milliseconds or whatever in this new theory.
 
  • #115
vanesch, I completely agree with your post. But I didn't notice that Deutsch claimed to have proven MWI or anything like that; he seems to me to be just strongly supporting the same view you just presented, that MWI is logically more consistent if we want to accept quantum mechanics literally (and we don't have many alternatives right now).
 
  • #116
xantox said:
How could one-photon realizations be best understood using classical optics, and how could classical optics help explain aspects of a quantum theory?

This is not the point. I'm simply objecting to the lingo he uses, like: "The outcome of this experiment depends on events in another universe" ... while he describes a simple optical interference experiment.

One can have an interpretation hypothesis, but one shouldn't preach it as being an absolute truth. In my opinion this is a lack of respect towards those students who can't yet distinguish between the scientifically proven facts here and his personal hypothesis / pet theory.


Regards, Hans
 
  • #117
Yes, I know a small amount of QM.

I like to think that I know a bit about QM, both the physics and the math. Enough to land me a job teaching QM -- I even managed to get my PhD with a dissertation involving QED, and learned my QM at Harvard and Stanford. Why, I've even lectured at Harvard, and the Fermi Lab -- when it was Argonne National Lab -- on relativistic QM and radiative corrections. I feel I'm safe in saying I understand things like superposition, and spin, and how to calculate cross sections, why I even understand both Fermi-Dirac and Bose-Einstein statistics, and complex angular momentum... So,

Frankly, I fail to see how the "axioms -- or whatever you want to call them --" imply the MWI approach, as some seem to suggest.

Worse yet, I believe in wave-function collapse; it occurs in people's brains as we gain knowledge of which alternative actually happens. There is absolutely no doubt that such a mental collapse occurs; we've all experienced such a collapse or change in mental state many times. You are stuck for a moment seeing someone you might have known once. Then "Aha, yes, that's Ed from my previous job." That is, we get a change in mental state as our knowledge changes. And people in the neurosciences understand more and more how this "collapse" occurs.

This knowledge-based approach was championed by Sir Rudolf Peierls. It ties into what I like to call the Practical Copenhagen Interpretation -- PCI. That is, use the Schrodinger Eq, or appropriate variations thereof, to compute wave functions; and use Born's idea that the absolute square of the wave function is a probability density. Use standard probability theory to continue; leave the collapse to the neuroscientists. That is, QM uses standard probability theory -- what else could it be?

After having written about this in many threads, I'm delighted to find an ally in Fra.

I assume, in a rejoinder, that you can compute 9-j symbols and fractional parentage coefficients, compute, say, a cross section for double pion photoproduction from a hadron, or get the exact solutions to the two-level atom interacting with the quantized E&M radiation field.

Shadow photons? Your explanation appears to be rather disjoint from Deutsch's discussion in, as some denote it, FAR. That his discussion is poetic is open to some doubt.

Regards,
Reilly Atkinson

(Sorry about all that name-dropping. Sometimes it just happens)

JesseM said:
How familiar are you with the mathematical structure of conventional (non-MWI) QM? Do you understand the idea that a quantum system is assigned a quantum state which evolves over time according to the Schroedinger equation, and that each quantum state involves a "superposition" of different possible eigenstates which correspond to particular measurement outcomes, with each measurement "collapsing" the system's state onto one of the eigenstates with a probability of collapsing into any eigenstate proportional to the square of its amplitude in the superposition before the measurement? If you are, then as I understand it the MWI twist on this is that there is no "collapse" on measurement, that the universe is assigned a single state which remains in a massive superposition, and that each macroscopically-distinct element of the superposition will appear as a distinct "world" to its inhabitants. So the question of the number would be somewhat subjective, depending on how coarse-grained a measure of "macroscopically-distinct" you use...the Everett FAQ says in question #11:

The FAQ also says in questions 6, 7 and 19 that worlds do "split" in the sense of their being multiple macroscopically-distinct later states for a single earlier state, so your question 2 wouldn't really apply. As for your own question 3, are you familiar with the Feynman path integral or sum-over-paths formalism in conventional QM, where the probability of measuring a particular outcome is calculated by doing a certain type of sum of all possible pathways leading up to that outcome, and allowing the different pathways to interfere with one another? I think Deutsch's talk about "shadow photons" is just a poetic way of discussing this, but with Deutsch believing that each path is actually taken by an alternate version of the photon.
 
  • #118
vanesch said:
Well, if you apply that bluntly to "Joe saw the red light go on" and "Joe saw the green light go on", and you realize that "the red light go on" was: the particle hit detector 1, and "the green light go on" was "the particle hit detector 2", then it is obvious that you arrive very quickly at situations where Joe's situation is described as:
a |Joe saw the red light go on> + b |Joe saw the green light go on>

And it is difficult to interpret this. We know that it will end up in one way or another that Joe has |a|^2 chance to see a red light, and |b|^2 chance to see a green light. We know that Joe won't see both.

Why not interpret the entangled superposition as the situation Joe was in before he saw the light, since that state is related to the superposition by a unitary transformation? If you want to do a measurement on Joe before he saw the light, you can still do that measurement after he interacted with the light, as long as he is in that superposition.

If L is the operator that measures what light Joe is seeing, with eigenvalues 0 (no light), 1 (red light), and 2 (green light), then the operator ULU^(-1), with U the time evolution operator, applied to the superposition should yield 0, so this suggests that the (entangled) superposition is just the old Joe (plus environment) who hasn't seen the light yet.
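This check can be run directly in the hypothetical particle/Joe toy model sketched earlier (my own construction, not Count Iblis's formalism): the conjugated operator, evaluated on the entangled state, reproduces the pre-measurement value 0.

```python
# Sketch, reusing the earlier hypothetical particle/Joe toy model.
# L measures which light Joe sees: eigenvalue 0 (none), 1 (red), 2 (green).
import numpy as np

a, b = 0.6, 0.8
particle = np.array([a, b])
ready = np.array([1.0, 0.0, 0.0])
psi_before = np.kron(particle, ready)

U = np.eye(6)                                  # same interaction unitary as before
U[[0, 1]] = U[[1, 0]]
U[[3, 5]] = U[[5, 3]]
psi_after = U @ psi_before

L = np.kron(np.eye(2), np.diag([0.0, 1.0, 2.0]))   # acts on Joe's memory

print(psi_after @ L @ psi_after)               # |a|^2 * 1 + |b|^2 * 2 = 1.64
print(psi_after @ (U @ L @ U.T) @ psi_after)   # 0.0: same as <L> before the
                                               # interaction -- the "old Joe"
```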
 
  • #119
reilly said:
I assume, in a rejoinder, that you can compute 9-j symbols and fractional parentage coefficients, compute, say, a cross section for double pion photoproduction from a hadron, or get the exact solutions to the two-level atom interacting with the quantized E&M radiation field.
No, I have only an undergraduate education so far. Are these things relevant to the topics under discussion now? Look, I didn't ask you your background because I wanted to start a physics pissing contest, sorry if you were offended but I just asked because of course I have no idea what a given username on this forum might know (unless I happen to remember from previous interactions with them), and my answers to your questions did depend on certain background knowledge.
reilly said:
Shadow photons? Your explanation appears to be rather disjoint from Deutsch's discussion in, as some denote it, FAR. That his discussion is poetic is open to some doubt.
Can you explain what specifically in Deutsch's explanation doesn't fit with the idea that he is granting equal reality to all the paths in the path integral?
 
  • #120
Apologies to JesseM

Yes, I did not need to wave my credentials, and apologize for so doing. When I was in graduate school and then when I was teaching, physics was a contact sport -- more than once I got hammered when giving a seminar, and more than once did the hammer thing myself. I still, after more than 40 years, have a Pavlovian response when there's even a suspicion that my credentials or ideas are being challenged. Much to my chagrin, I do not always keep my cool under such circumstances.

I know there are many who agree with me when I say that you never really understand QM until you have taught it, which means first a dissertation or long paper based on QM. In other words, you have to do it. Book learnin' is not enough. Got to deal with hbars, and 2pis, and signs, and tons of algebra with reality checks.

Your questions are perfectly reasonable.

Forget about shadow photons unless you want to get hooked into a long chain of contradictions.

Regards,
Reilly Atkinson
JesseM said:
No, I have only an undergraduate education so far. Are these things relevant to the topics under discussion now? Look, I didn't ask you your background because I wanted to start a physics pissing contest, sorry if you were offended but I just asked because of course I have no idea what a given username on this forum might know (unless I happen to remember from previous interactions with them), and my answers to your questions did depend on certain background knowledge.

Can you explain what specifically in Deutsch's explanation doesn't fit with the idea that he is granting equal reality to all the paths in the path integral?
 
  • #121
Fra -- Right on

Great to have an ally. The Bohr quote is great -- thanks.
Regards,
Reilly




Fra said:
IMO, the only thing that collapses is the projection that lives inside the observer, which I see as no weirder than a Bayesian update.

Why my view of the world changes when I receive more information about it is quite obvious. Whatever the world REALLY is, it still has to be projected onto my perspective.

I think these words of Niels Bohr's still stand:

"It is wrong to think that the task of physics is to find out how nature is. Physics concerns what we can say about nature...".

/Fredrik
 
  • #122
reilly said:
Yes, I did not need to wave my credentials, and apologize for so doing. When I was in graduate school and then when I was teaching, physics was a contact sport -- more than once I got hammered when giving a seminar, and more than once did the hammer thing myself. I still, after more than 40 years, have a Pavlovian response when there's even a suspicion that my credentials or ideas are being challenged. Much to my chagrin, I do not always keep my cool under such circumstances.
Thanks for that, and no offense taken, I know from personal experience that it's especially easy on the internet to jump to conclusions about someone's tone or about what they're implying.
reilly said:
I know there are many who agree with me when I say that you never really understand QM until you have taught it, which means first a dissertation or long paper based on QM. In other words, you have to do it. Book learnin' is not enough. Got to deal with hbars, and 2pis, and signs, and tons of algebra with reality checks.

Your questions are perfectly reasonable.

Forget about shadow photons unless you want to get hooked into a long chain of contradictions.
Well, I wouldn't disagree with the idea that the best way to learn physics is by doing. But interpretations like the MWI or Bohmian mechanics involve a fair amount of mathematical elaboration too; I think it'd be a mistake to dismiss some element of an interpretation as self-contradictory just based on what one of the interpretation's advocates says in a nontechnical discussion aimed at a broad audience (of course you might also argue that thinking about 'interpretations' is pointless if they make no new predictions, even if the interpretation doesn't involve any internal contradictions).
 
  • #123
reilly said:
Worse yet, I believe in wave-function collapse; it occurs in people's brains as we gain knowledge of which alternative actually happens. There is absolutely no doubt that such a mental collapse occurs; we've all experienced such a collapse or change in mental state many times. You are stuck for a moment seeing someone you might have known once. Then "Aha, yes, that's Ed from my previous job." That is, we get a change in mental state as our knowledge changes. And people in the neurosciences understand more and more how this "collapse" occurs.

This knowledge-based approach was championed by Sir Rudolf Peierls. It ties into what I like to call the Practical Copenhagen Interpretation -- PCI. That is, use the Schrodinger Eq, or appropriate variations thereof, to compute wave functions; and use Born's idea that the absolute square of the wave function is a probability density.

But the point is that the absolute square of the wave function isn't ALWAYS a probability density! So when does it *become* a probability density ? This is the main issue which leads to considerations of an MWI approach.
See, in the double slit experiment, when the wave function is a superposition of "is in left slit" and "is in right slit", it is NOT a probability density. For if it were, we wouldn't have any interference, as we would simply have that the final probability to have a hit in position X would be P(X | A) P(A) + P(X|B) P(B) where A stands for slit 1 and B stands for slit 2.

So along the way, the wavefunction evolves from something that clearly is NOT a probability density (when it is at the two slits) to something that IS a probability density (when it "hits the screen"). It is this undocumented "change in nature" of the wavefunction (from "a physical entity which is not a probability" to "a probability") which stands in the way of a pure "bayesian collapse" view on the measurement problem.
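Numerically, the two readings give visibly different screen patterns (a sketch with idealized point-source slits; the slit separation, wavelength and screen distance are illustrative assumptions, only meant to exhibit the formula quoted above):

```python
# Sketch: treat the state at the slits either as a 50/50 probability mixture,
#   P(x) ~ 0.5 |psi_A(x)|^2 + 0.5 |psi_B(x)|^2        (no interference),
# or apply the Born rule to the superposition,
#   P(x) ~ 0.5 |psi_A(x) + psi_B(x)|^2                (interference fringes).
# Slit separation, wavelength and screen distance are illustrative assumptions.
import numpy as np

wavelength, slit_sep, screen_dist = 1.0, 5.0, 100.0
k = 2 * np.pi / wavelength
x = np.linspace(-20.0, 20.0, 5)                 # a few screen positions

def amp(x_screen, slit_y):
    r = np.sqrt(screen_dist**2 + (x_screen - slit_y)**2)
    return np.exp(1j * k * r) / r               # spherical-wave amplitude from one slit

psi_A = amp(x, +slit_sep / 2)
psi_B = amp(x, -slit_sep / 2)

mixture = 0.5 * np.abs(psi_A)**2 + 0.5 * np.abs(psi_B)**2
superposition = 0.5 * np.abs(psi_A + psi_B)**2

for xi, m, s in zip(x, mixture, superposition):
    print(f"x = {xi:6.1f}:  mixture = {m:.2e}   superposition = {s:.2e}")
# The mixture column is smooth; the superposition column has deep minima near
# x = +/-10 and enhanced maxima near x = 0, +/-20 -- the interference that the
# "probability at the slits" reading cannot produce.
```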

You can get around this in several ways. You can insist on the bayesian nature all the way, but then, in order to get out interference, you have to introduce extra elements. That's what the Bohmians do. The particle DID go through just slit 1 or just slit 2, and it is a complicated dynamics, with a *physical* wavefunction which NEVER becomes a probability distribution, that guides it to the interference image.
Or you can insist on the non-bayesian nature of the wavefunction all the way, accepting it as a purely physical state, and that's what MWI does for instance.
You can also alter the dynamics of the wavefunction, and give it a physical status all the way, at which point the "collapse" is just part of the (non-linear) dynamics.

But you cannot "sneak in a change in PoV", which is what is implicitly done when the wavefunction goes from (certainly not a probability function) at the microscopic level to (a probability function) at the Heisenberg cut, without at least some explanation of where this change came about, because the question that comes up then is:
why can't I consider, from the start, an electron orbital as just a probability density to find the electron ? And you know very well that if you do so, you do not get out the same results as when you keep the electron wavefunction as a wavefunction and not a statistical mixture of positions.
 
  • #124
I agree with part of what Reilly says but also with part of what vanesch says. I don't know if my view is the same as Reilly's all the way, but this is what I personally think:

A plain bayesian view _alone_ is insufficient (agreeing with vanesch), but it's IMO most probably part of the progress, though not the end of the story! That's why I think we need to go back to probability theory and think again; we need more...

IMO, the clearly identifiable problem is that the probability space is not clearly coupled to the current information and history like I think it should be; this is why the arbitrary decomposition between coherent and non-coherent mixtures occurs. I don't see this as a pure QM problem; it goes deeper, and may relax unitarity because the probability space itself is uncertain and dynamical. I think this falls in the category that vanesch calls the non-linear approach.

This is what I personally see as the route forward in the spirit of "minimum speculation".

vanesch said:
But the point is that the absolute square of the wave function isn't ALWAYS a probability density! So when does it *become* a probability density ?

I'd choose to say it becomes a probability density when more information about the "microstructure" of our system is collected, or when the probability space has stabilised (more or less the same thing). But of course this is a non-unitary process. The evolution isn't effectively unitary until the probability space has stabilised.

This is reasonably similar to some decoherence-based ideas, but one still needs to quantify, in terms of knowledge, how certain we are that we have a closed system. Usually we don't know, and I consider that a fact; it just _seems to be closed_, and I suggest we quantify this. Also, considering the system + observer doesn't solve the point, it just moves it.

I imagine the "probability space" as something that is emergent as the observer evolves in the environment; once the observer is equilibrated in its environment, I think the residual uncertainty will most probably be ordinary unitary QM. But that's clearly a special case.

I can't wrap my head around unitary evolution in the general case. The unitary picture needs to be selected.

/Fredrik
 
  • #125
I no longer understand half of this thread:P but anyway thanks a lot guys.
I've decided not to put too much faith into MWI, not only because it's bizarre, but also because I was handed some evidence and tests done against it which convince me it's bull****.
I think MWI is built purely on wishful thinking of some and conviction by others.
Since there's NO proof or anything indicating MWI to be true, I think I'll go on living in reality without "reconsidering" how it works:PP

Thanks a lot
 
  • #126
Confused, I'd suggest you try to spend some time thinking on your own. There is usually no replacement for thinking on your own and coming up with your own conclusions. Learning from others is great, but some things you just have to decide for yourself. If you feel you cannot make such a decision, I suggest you read up on the foundations and the philosophical issues with measurements and go through the pain :) Chances are you haven't appreciated the question to start with, so when you are faced with possible answers to these questions they of course make no sense, because it's difficult to relate to them.

/Fredrik
 
  • #127
Hans de Vries said:
This is not the point. I'm simply objecting to the lingo he uses, like: "The outcome of this experiment depends on events in another universe" ...
while he describes a simple optical interference experiment. One can have an interpretation hypothesis, but one shouldn't preach it as being an absolute truth.

OK, I agree that he should use some precautions in introducing the interpretation. However, concerning the experiment, in lecture 2 he's describing a single-photon setup, so it can be correctly viewed as a quantum computation without classical analog.
 
  • #128
Fra said:
Confused, I'd suggest you try to spend some time thinking on your own. There is usually no replacement for thinking on your own and coming up with your own conclusions. Learning from others is great, but some things you just have to decide for yourself. If you feel you cannot make such a decision, I suggest you read up on the foundations and the philosophical issues with measurements and go through the pain :) Chances are you haven't appreciated the question to start with, so when you are faced with possible answers to these questions they of course make no sense, because it's difficult to relate to them.

/Fredrik
Good advice.
It's just, MWI seems retarded to me, from common sense and what I know of science and quantum physics; someone on here provided me links to proof against MWI, so.
Don't know why even more money is wasted on researching it; guess it's like... faith lol:P

I mean, in one universe you'd take plastic surgery to look like ur mom after rapin and killin her, just because it's physically possible. I mean seriously, how the **** can anyone believe in **** like this
 
  • #129
Vanesch, have you been thinking about writing a long pedagogic paper on MWI where all FAQs would be clearly and honestly answered? I strongly believe that you could do that very well, perhaps better than any already existing paper on that subject. Next time when somebody asks a question, you just tell him - go and read Section XX in my paper on arXiv. If you do that paper well (as I am sure you could) you could even publish this paper in a journal like Foundations of Physics.
 
  • #130
confusedashell said:
Good advice.
It's just, MWI seems retarded to me, from common sense and what I know of science and quantum physics; someone on here provided me links to proof against MWI, so.
There are no "proofs" against MWI around. Also, common sense is useless, since many scientific theories, like relativity, also go against common sense and they are nevertheless correct. There are two kinds of people who could be interested in MWI. The first is scientists and philosophers of science, and they of course need to deal with MWI seriously; it is not a waste of time at all. The second is every one of us in our daily lives, and in this case you don't need anything like an interpretation of quantum theory as a foundation for your actions and moral life, like you're trying to. Better go around with your girlfriend!

confusedashell said:
I mean, in one universe you'd take plastic surgery to look like ur mom after rapin and killin her, just because it's physically possible.
No, you would not do that even in MWI.
 
  • #131
if u don't do every possible action then MWI is ultimately wrong. so mwi is ultimately wrong. Common sense:) 2+2 still gon be 4 even if some scifi professor makes up some untestable claims.
Untestable = non-science in my eyes. Philosophy at best, in MY eyes.

I declare MWI dead and won't waste more time on it, but please use this thread to share ur ideas for and against it:)

http://www.boingboing.net/2004/04/26/many-worlds-theory-i.html << against MWI
 
  • #132
If MWI is the wrong answer, to what question is it the wrong answer? :) All I know is that it does not seem like the answer to any questions of mine. But then I need to be humble enough to confess I can not and should not speak for others.

Perhaps the trick is to pose the right question, and then MWI may seem more reasonable. My strong impression from talking to people on here, as well as from reading the posts, is that ultimately we disagree on the formulation of the core questions and on what the heart of the problem is that determines the direction of our efforts. This is reflected in the questions we ask.

Like has been pointed out many times, choosing the questions and finding the answers are closely related.

/Fredrik
 
  • #133
confusedashell said:
if u don't do every possible action then MWI is ultimately wrong. so mwi is ultimately wrong. Common sense:)
Wrong :-). Look, when you decide to do something, you're using your rational faculties, aren't you? You have some reason to do it; it's not like you do it at random. Just like a computer will say that 1+1=2 in almost all universes, and not any result at random. So even in MWI, in almost all other universes, you will use your reason in exactly the same way, so you will not do nasty things if you are not a nasty person to start with.

PS/ That boingboing article is relating claims which were proven wrong. The Afshar experiment didn't invalidate anything, since standard quantum theory predicts exactly the same results. Since MWI has the same predictions as standard quantum theory, it can't be invalidated by such an experiment.
 
  • #134
confusedashell said:
if u don't do every possible action then MWI is ultimately wrong. so mwi is ultimately wrong. Common sense:) 2+2 still gon be 4 even if some scifi professor makes up some untestable claims.
Untestable = non-science in my eyes. Philosophy at best, in MY eyes.

I declare MWI dead and won't waste more time on it, but please use this thread to share ur ideas for and against it:)

http://www.boingboing.net/2004/04/26/many-worlds-theory-i.html << against MWI

This famous "Afshar experiment" and its erroneous analysis was what Hans de Vries was referring to earlier, and reduces to a classical optics experiment, misunderstood.

The so-called transactional interpretation has 2 problems: first of all, things need to come "from the future", and second, I've never seen it being expanded beyond the single-particle situation. Now, maybe I'm wrong on the second point.

In any case, there cannot be an invalidation of MWI without also an invalidation of quantum theory on a certain level. It is not impossible, and it would surely be interesting news, but as long as quantum theory is assumed correct, there's no way to invalidate MWI. Copenhagen is even worse, because Copenhagen can handle it both ways (it has the extra freedom of choosing where to put the Heisenberg cut).

So you will NEVER read that MWI has been falsified. You might read one day that *quantum theory* has been falsified. MWI falls then with it. Copenhagen can still survive.

People claiming that MWI has been proven, or that MWI has been falsified, independent of quantum theory itself, obviously do not know what they are talking about: it is an *interpretation* of the formalism. As such, it makes exactly the same predictions as the formalism. Only, MWI needs the validity of the quantum formalism for macroscopic systems, which might not be applicable. If it is not applicable, then we've found a LIMIT to the applicability of the *quantum formalism*. That's much bigger news than an interpretation being correct or not!
 
  • #135
vanesch -- I disagree; I have no idea why the absolute square of a wave function can be anything else but a probability density, given Born's ideas.

Your probability density argument disagrees with both QM and classical electrodynamics; with any system subject to a linear wave equation. Huygens' Principle guarantees that the absolute square of the wave function, or intensity, contains interference terms. Why? Always, according to Huygens, a solution to a wave equation can be expressed as a sum, so the intensity contains interference terms -- if the sum has two or more terms.

Another way to look at it is to go from a scattering problem -> Fraunhofer Diffraction.

Forgetting that initial conditions for a diffraction problem are tricky, one could state that the initial value of the wave function is 0, except for two disjoint finite intervals with values w1 and w2. Use a Feynman propagator, the Lippmann-Schwinger or resolvent approaches, and effectively what you see is sort of equivalent to two beams, one from each slit, and these beams interact and thus scatter.

Re Bayesian collapse: we talk about the probability of an event -- or more than one. All QM normally does is to predict an event, and nothing after the event. There's no real problem of Bayesian collapse or inference. Suppose you are in traffic and figure the odds of making the next light are, say, 2-1. Once you get to the light, you know, so the probability estimate is irrelevant -- unless you are considering a series of lights, or ..

And the QM dynamics guarantee that the norm of the solution is invariant under time translations, except for time-dependent Hamiltonians, ..., so, it seems to me, that

1. QM gives a perfectly reasonable probability system -- with wave functions or more generally, density matrices. The absolute square of any solution of a linear wave equation provides a valid probability measure.

2. If, as we do in probability theory, we talk about events, much of the mystery of measurement goes away. Probability theory assumes only that we can recognize events, and distinguish between them. Einstein talked about events and their relationships, but didn't say much about the details of reading a clock. It seems to me that if we keep things simple we can do much of what we want to do -- simple being the use of probability theory's and Einstein's events, and assuming that we can make the appropriate measurements.

3. A theory of measurements is, in my view, way beyond us. But that's not a problem as we do pretty well without one- true classically and "quantumly". Would it be better to have such a theory? Yes, of course.

Regards,
Reilly
vanesch said:
But the point is that the absolute square of the wave function isn't ALWAYS a probability density! So when does it *become* a probability density ? This is the main issue which leads to considerations of an MWI approach.
See, in the double slit experiment, when the wave function is a superposition of "is in left slit" and "is in right slit", it is NOT a probability density. For if it were, we wouldn't have any interference, as we would simply have that the final probability to have a hit in position X would be P(X | A) P(A) + P(X|B) P(B) where A stands for slit 1 and B stands for slit 2.

So along the way, the wavefunction evolves from something that clearly is NOT a probability density (when it is at the two slits) to something that IS a probability density (when it "hits the screen"). It is this undocumented "change in nature" of the wavefunction (from "a physical entity which is not a probability" to "a probability") which stands in the way of a pure "bayesian collapse" view on the measurement problem.
 
  • #136
xantox said:
OK, I agree that he should use some precautions in introducing the interpretation. However, concerning the experiment, in lecture 2 he's describing a single-photon setup, so it can be correctly viewed as a quantum computation without classical analog.

Despite the single photon setup, nothing in the outcome of the experiment depends on the quantum behavior of photons. That is, an experiment with bursts of sound waves or water waves would lead to the same interference result.

An experiment which does show single photon quantum behavior, for instance, uses two detectors at both outputs of a beamsplitter (typically a Wollaston prism) and demonstrates that only one of the two detectors goes off after a single photon went through the beamsplitter.


Regards, Hans
 
  • #137
Hans de Vries said:
Despite the single photon setup, nothing in the outcome of the experiment depends on the quantum behavior of photons. That is, an experiment with bursts of sound waves or water waves would lead to the same interference result.
Yes, but since in this setting there is no water, I think we have to explain what happens by using single photons, and here the classical explanation fails. It is not about understanding interference, since it's a course in quantum computation. If we send each photon through a beam splitter, followed by a mirror, followed by a beam splitter, and try to interpret this by using classical bits, we would have a coin flip followed by a NOT operation followed by a coin flip, which doesn't yield the identity operation performed by the quantum system.
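A sketch of this point with explicit matrices (a common beam-splitter convention is assumed here; Deutsch's lecture may use different phase conventions): the quantum sequence beam splitter / mirror / beam splitter acts deterministically, while the classical coin-flip model of the same sequence stays 50/50.

```python
# Sketch: beam splitter -> mirror (swap of the two paths) -> beam splitter.
# The 50/50 beam-splitter unitary below is one common convention (an assumption).
import numpy as np

BS = np.array([[1, 1j], [1j, 1]]) / np.sqrt(2)      # 50/50 beam splitter
MIRROR = np.array([[0, 1], [1, 0]])                 # swaps the two paths

psi_in = np.array([1.0, 0.0])                       # photon enters in path 0
psi_out = BS @ MIRROR @ BS @ psi_in
print("quantum:  ", np.round(np.abs(psi_out)**2, 3))   # [1. 0.] -- deterministic

# Classical-bit caricature: coin flip -> NOT -> coin flip (stochastic matrices).
COIN = np.array([[0.5, 0.5], [0.5, 0.5]])
NOT = MIRROR.astype(float)
p_out = COIN @ NOT @ COIN @ np.array([1.0, 0.0])
print("classical:", p_out)                             # [0.5 0.5] -- stays random
```

The second pass through the beam splitter undoes the apparent randomness of the first only because the two intermediate paths interfere, which is exactly the part the classical-bit description leaves out.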
 
  • #138
reilly said:
vanesch -- I disagree; I have no idea why the absolute square of a wave function can be anything else but a probability density, given Born's ideas.

Your probability density argument disagrees with both QM and classical electrodynamics; with any system subject to a linear wave equation. Huygens' Principle guarantees that the absolute square of the wave function, or intensity, contains interference terms. Why? Always, according to Huygens, a solution to a wave equation can be expressed as a sum, so the intensity contains interference terms -- if the sum has two or more terms.

The difference between a classical wave superposition and a quantum interference is the following: in the classical superposition of a wave from slit 1 and a wave from slit 2, we did not have a PROBABILITY of 50% - 50% that the "wave" went through slit 1 or slit 2; it went physically through BOTH. In classical physics, intensity has nothing to do with probability, but is just the amount of energy that is in a particular slot. HALF of the energy went through slit 1 and HALF of the energy went through slit 2 in the classical wave experiment.

But we cannot say that "half of the particle" went through slit 1 and "half of the particle" went through slit 2, and then equate this with the *probability that the entire particle* went through slit 1 is 50% and the probability that it went through slit 2 is also 50%, because if that were true, we wouldn't have a LUMPED impact on the screen, but rather 20% of a particle here, 10% of a particle there, etc...

Now, it is so that the application of the Born rule to a quantum optics problem gives you in many cases just the classical intensity as a probability, which somehow instills the confusion between "classical power density" and "quantum probability", but these are entirely different concepts. The classical power density has nothing of a probability density in a classical setting, and a probability has nothing of a power density in a quantum setting.

The classical superposition of waves has as much to do with probabilities as the superposition of, say, forces in Newtonian mechanics. Consider the forces on an apple lying on the table: there's the force of gravity, downward, and there's the force of reaction of the table, upward. Now, does that mean that we have a 50% chance for the apple to undergo a downward fall, and a 50% chance for it to be lifted upward? This is exactly the same kind of reasoning that is applied when saying that two classical fields interfering is equivalent to two quantum states interfering. The classical fields were BOTH physically there (just as the two forces are). Their superposition gives rise to a certain overall field (just as there is a resultant force 0 on the apple). At no point is a probability invoked.

But in the quantum setting, the wavefunction is made up of two parts. If this "made up of two parts" is interpreted as a probability at a certain point, you'd be able to track the system on the condition that case 1 was true, and to track the system on the condition that case 2 was true. But that's not the result when you compute the interference of the two parts. So when the wavefunction was made up of 2 parts, you CANNOT consider it as a probability distribution. But when the interference pattern is there, suddenly it DID become a probability distribution. THAT'S where a sleight of hand took place.
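
Spelling this out with normalized branches makes the offending cross term explicit:

\left|\tfrac{1}{\sqrt{2}}\left(\psi_1 + \psi_2\right)\right|^2 = \tfrac{1}{2}|\psi_1|^2 + \tfrac{1}{2}|\psi_2|^2 + \mathrm{Re}\left(\psi_1^* \psi_2\right)

The first two terms are exactly the "50/50 probability" reading; the last term is the interference that such a reading has no room for.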

Forgetting that initial conditions for a diffraction problem are tricky, one could state that the initial value of the wave function is 0, except on two disjoint finite intervals with values w1 and w2. Use a Feynman propagator, the Lippmann-Schwinger or resolvent approaches, and effectively what you see is sort of equivalent to two beams, one from each slit, and these beams interact and thus scatter.

Absolutely. But classically, that would mean that BOTH beams are physically present, and NOT that they represent a 50/50 probability!


Re Bayesian collapse: we talk about the probability of an event -- or more than one. All QM normally does is to predict an event, and nothing after the event. There's no real problem of Bayesian collapse or inference. Suppose you are in traffic and figure the odds of making the next light are, say, 2-1. Once you get to the light, you know, so the probability estimate is irrelevant -- unless you are considering a series of lights, or ..

And the QM dynamics guarantee that the norm of the solution is invariant under time translations, except for time-dependent Hamiltonians, ..., so, it seems to me, that

1. QM gives a perfectly reasonable probability system -- with wave functions or more generally, density matrices. The absolute square of any solution of a linear wave equation provides a valid probability measure.

This is ONLY true when ACTUAL measurements took place! You cannot (such as at the slits in the screen) "pretend to have done a measurement" and then sum over all cases, which you can normally do with a genuine evolving probability distribution.
If the wavefunction decomposes, at a time t1, into disjoint states A, B and C, then, if it were possible to interpret the wavefunction as a probability density ALL THE TIME, it would mean that we can do:

P(X) = P(X|A) P(A) + P(X|B) P(B) + P(X|C) P(C)

and that *doesn't work* in quantum theory if we don't keep any information (by entanglement for instance) about A, B or C. So when we have the wavefunction expanded over A, B and C, we CANNOT see it as giving rise to a probability distribution.

Now, of course, for an actual setup, quantum theory generates a consistent set of probabilities for observation in the given setup. But the way the wavefunction behaves *in between* is NOT interpretable as an evolving probability distribution.
So the wavefunction is sometimes giving rise to a probability distribution (namely at the point of "measurements"), but sometimes not.

Now, you can take the attitude that the wavefunction is just "a trick relating preparations and observations". Fine. Or you can try to give a "physical picture" of the wavefunction machinery. Then you're looking into interpretations.
 
  • #139
What I personally meant by "going back to the origin and axioms of probability theory" for the resolution, and what I thought Reilly was also referring to, is something very basic, but something that I think is causing a lot of problems.

It's the idea that while we cannot with arbitrary precision determine the future, we CAN with arbitrary precision determine the probability for any possibility. This simple statement, occurring for example in the first chapter of Dirac's "The Principles of Quantum Mechanics", is IMO where we lay the ground for the future problems.

Then there are attempts to deal with this problem with density matrices, non-coherent mixtures and decoherence. But so far I've personally found that insufficient.

This I see as touching on the philosophy of probability theory and information theory, and it is thus not QM-specific as such.

The real problem I see is that the usual argument is the frequentist idea that we can find the probability by carrying out an infinite number of experiments. Why this is highly unsatisfactory should be clear if you consider processing power and information capacity. We cannot make imaginary use of information that belongs to the future.

My personal questions start here. How can we improve the foundation?

I am trying to find a possible revision of the probability formalism which introduces a fuzzier (non-unitary) formalism, where the probabilities as we know them today will be more appropriately seen as "subjectively estimated" probabilities, induced from *incomplete information*, so that the probability space itself is not yet known.

The idea I favour is that these estimated probabilities are encoded in the microstructure of the observer, and that the probabilities simply correspond to uncertainty in the observer's microstructure and state, which in turn is a reduced projection of the environment.

This will, I hope, yield standard probability as emergent. And the idea is to also assign a physical basis to the formalism itself, and to the probability space. Usually this is done carelessly, and one imagines infinite amounts of data and infinite experiments to justify the probability, but rarely does one consider in what physical structures this information is encoded. Some decoherence ideas consider the environment as an information sink which records everything in correlations, but that is also a problem, because the observer only sees a fraction of the environment. So there is still something missing.

Reilly, is this anything like what you had in mind as well, or what did you mean by "going back to probability theory"?

/Fredrik

Edit: Another issue is that I don't think it's in general valid to consider the environment as an infinite information sink. One can imagine a case where the observer's information capacity is comparable to that of the environment; then the sink idealisation fails. And I guess the key is that an observer does not have a priori knowledge of the size of the environment in which it's immersed. The only way to find out is to interact and try to learn something.
 
  • #140
Fra said:
Edit: Another issue is that I don't think it's in general valid to consider the environment as an infinite information sink.

This is probably a very good approximation when it comes to human particle physics experiments, because then in effect we control and monitor the effective environment of the localised experiment, and we could in principle inform ourselves quite well about correlations of the environment with the system.

But this idealisation doesn't seem nearly as appealing if one considers cosmology or gravity interactions, or cases where the observer's complexity is far less than the complexity of the system under study.

/Fredrik
 
  • #141
I'm probably making a mistake by adding this, but...

vanesch said:
you do not get out the same results as when you keep the electron wavefunction as a wavefunction and not a statistical mixture of positions.

We attack this differently, but I think I see vanesch's point here. I am not quite prepared to present my view consistently yet (I'm working on it), but my information-theoretic ideas to explain the connection between statistical (impure) mixtures and pure states depend on how we define "addition of information".

I technically see different measurements, belonging to different probability spaces. And the exact relation between the spaces is needed to define addition of information.

Since a conditional probability is normally defined as

P(A|B) := \frac{P(A \cap B)}{P(B)}

the question is what P(A|B) is supposed to mean unless A and B belong to the same event space.

This is IMO part of the key to explaining the reasons for different ways to "add information". And when we are dealing with systems of related "probability spaces" between which we make transformations, then it does matter in which space we make the addition. This is something that normally isn't analysed to satisfaction. Normally the defining relation between momentum and position is postulated, one way or the other. This postulates away something I personally think there is a better explanation for.

I think there is a way to grow a new space based on the uncertainty in the existing one. New relations are defined in terms of patterns in the deviations of existing relations, and relations are "selected" by a mechanism similar to natural selection. The selected relations implement an efficient encoding and retain a maximum amount of information about the environment in the observer.

The limit is determined by the observer's information capacity, and beyond that the only further "progress" that can be made is for the observer/system to try to increase its mass. Here I expect a deep connection to gravity.

Constraining the information capacity is also, I think, the key to staying away from infinities and generally divergent calculations. There will be a "natural cutoff" due to the observer's limited information capacity.

I'm sorry if this makes no sense, but I hope that, given more time, I'll be able to put the pieces together. It's all designed to be in the spirit of minimum-speculation, information-based reasoning that I personally consider to be the most natural extension of the minimalist interpretation.

/Fredrik
 
  • #142
Fra said:
I technically see different measurements, belonging to different probability spaces.
I'm skeptical. Example, please?
 
  • #143
Hmmm... you should be skeptical of course :) And I haven't yet finished this enough to present it properly, which is why I should perhaps be quiet, but consider different "measurements" (operators if you like): each such measurement can, loosely speaking, be thought of as spanning a probability space of its own (the event space defined by the span of the possible events) - but how do different measurements relate to each other?

Then usually the relation between these operators is postulated as commutator relations etc, but if you instead consider the measurements or "operators" to be dynamical objects, then new possibilities open up. But how has the world evolved? Does nature postulate relations? I think there is a more natural way to view the relations.

However, I can't give a complete explanation of this at the moment; that's why I should perhaps be quiet. I was curious whether Reilly recognizes any of this or not. But from his last comments I fear not.

/Fredrik
 
  • #144
comment on immature ideas

An example would be measuring position and measuring momentum.

We could postulate the existence of two different observables and then postulate their relation, or do we postulate one observable and try to see if there is a principle that induces naturally complementing observables in the context of the first one, on an as-needed basis?

If we take the set of distinguishable configurations (x) to make up configuration space, then all we can do is see what the relative frequencies are in the retained history. But what if we try to find a transformation, and use part of our memory (the state of the observer's microstructure in the general case) to store a transformed pattern which defines a new event space - could that give us lower uncertainty?

This transformation then relates two event spaces (probability spaces).

We can use a Fourier transform to define momentum (p) and try to find momentum patterns, but why Fourier transform? Is there a principle which renders this transform unique?
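
For reference, a minimal sketch of the standard construction referred to here (illustration only: a Gaussian packet is assumed, with hbar = 1), showing that the Fourier-transformed amplitudes are again a normalized distribution over momentum:

Code:
import numpy as np

# Illustration only: position-space Gaussian packet with mean momentum ~2 (hbar = 1).
N = 4096
x = np.linspace(-50, 50, N)
dx = x[1] - x[0]
psi_x = np.exp(-x**2 / 4) * np.exp(1j * 2.0 * x)
psi_x /= np.sqrt(np.sum(np.abs(psi_x)**2) * dx)               # normalize |psi(x)|^2

# Momentum-space amplitudes via a discrete Fourier transform.
p = 2 * np.pi * np.fft.fftshift(np.fft.fftfreq(N, d=dx))      # momentum grid
psi_p = np.fft.fftshift(np.fft.fft(psi_x)) * dx / np.sqrt(2 * np.pi)
dp = p[1] - p[0]
print(np.sum(np.abs(psi_p)**2) * dp)                          # ~1: |psi(p)|^2 is again normalized
print(p[np.argmax(np.abs(psi_p)**2)])                         # ~2: the mean momentum of the packet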

Given a limited information capacity, it seems the possible transformations must also be limited and transformations/patterns will be induced spontaneously. I have no answer on this, but my gut feeling says there is one and I'm looking for it.

But there is intercommunication between the spaces: they can shrink or inflate at the expense of one another, as well as grow new dimensions, and there is supposedly a particular configuration that is "most favourable" in a given environment, where the observer's microstructure is selected for maximum fitness given the limited information capacity.

What I am trying, but haven't yet succeeded with, is to build structures starting from a minimalist starting point of a boolean observable, which can be seen as an axiom claiming there is a concept of distinguishability defined for each observer. From that, complexity and new structures are built as the observer interacts with the environment. The relations formed are selected by the interaction with the environment.

So higher dimensions are built as extensions and aggregates from a basic notion of distinguishability. The dimensions and structures are emergent in the learning/evolution process. The structures selected represent an efficient encoding of information. Inefficient structures are destabilised and will lose information capacity because they consistently fail to keep up with the environmental perturbations.

Anyway, the idea is that then all observables are connected by a natural connection, and we can "add information" from any part of the structure, but the actual "additions" would have to be made by respecting the selected connections. It may give rise to apparently twisted statistics, but when seen from the unified view, with all the information transformed into the original event space, I think it will take on a simple form.

This, when understood, isn't just "an interpretation" that makes no difference; it could be realized as a complete self-correcting strategy, which is why I personally like it.

My point in mentioning admittedly incomplete and in-progress ideas is only to provoke questions and possible ways forward in the spirit of information ideas and of talking about going back to probability theory. I still think we choose to focus on different things.

I have no idea how long it will take to mature this. I wouldn't be surprised if it takes several years, considering that this is my hobby time, but I know there are others who share the ideas, and I look forward to seeing more work along these lines, so I'd like to encourage any ideas relating to this, because I realize it's a minority idea.

/Fredrik
 
  • #145
small note

vanesch said:
P(X) = P(X|A) P(A) + P(X|B) P(B) + P(X|C) P(C)

and that *doesn't work* in quantum theory if we don't keep any information (by entanglement for instance) about A, B or C. So when we have the wavefunction expanded over A, B and C, we CANNOT see it as giving rise to a probability distribution.

I guess this is obvious, but a reason for this can still be seen within normal probability theory: it is using yet another hidden implicit condition, namely the microstructure/background (M) that A, B and C refer to. Making this dependence explicit we get

P(X|M) = P(X|A) P(A|M) + P(X|B) P(B|M) + P(X|C) P(C|M)

but

P(X) = \sum_{M} P(X|M)P(M)

if this is a sum over all possible backgrounds, and thus

P(X) \neq P(X|A) P(A|M) + P(X|B) P(B|M) + P(X|C) P(C|M)

unless A, B and C make a complete partitioning, which is exactly related to the selection of a particular background M.

I ultimately envision that this background makes up the identity of the observer, and contains a selection of retained history.

The only real problem here is how one resolves, in a sensible way, the set of all possible backgrounds. This is something I've been giving some thought, and this problem is what gives rise to dynamics. It is not possible to resolve all possible backgrounds just like that, since it somehow must involve information storage and computing. This is why I think that summation must be intrinsically related to change and ultimately time. Given any set of microstructures, one can always raise the same complaint, which suggests that a sensible solution involves a non-unitary evolution.

The ultimate reason for this is that we cannot easily "measure our probability space" without getting into silly things like infinite experiments and infinite data storage.

/Fredrik
 
  • #146
vanesch:

I'm looking at the mathematical properties of wave equation solutions. In particular, the fact that wave function norms are preserved by wave equations guarantees that the absolute square of any normalized wave function can be viewed as a probability measure; we'll call it QP for quantum probability (energy density for others, intensity, or what have you). I agree that, particularly in classical physics, the probability density associated with a wave equation has only a formal meaning, and is usually interpreted differently.

Thus, it seems to me that the QP is indeed always a probability density, given its mathematical properties. Why not?

vanesch said:
The difference between a classical wave superposition and a quantum interference is the following: in the classical superposition of a wave from slit 1 and a wave from slit 2, we had not a PROBABILITY 50% - 50% that the "wave" went through slit 1 or slit 2, it went physically through BOTH. In classical physics, intensity has nothing to do with probability, but just the amount of energy that is in a particular slot. HALF of the energy went through slit 1 and HALF of the energy went through slit 2 in the classical wave experiment.

But we cannot say that "half of the particle" went through slit 1 and "half of the particle" went through slit 2, and then equate this with the *probability that the entire particle* went through slit 1 is 50% and the probability that it went through slit 2 is also 50%, because if that were true, we wouldn't have a LUMPED impact on the screen, but rather 20% of a particle here, 10% of a particle there, etc...
Why would I want to say, "half the particle went through slit 1,...?" Such a statement makes no sense, except, perhaps, as highly figurative language.
In the symmetrical case, the initial conditions for slits state that the probability to find A in the left slit is equal to the probability to find A in the right slit. It's either there or here, but not in both places. The way we describe such a case is with probability in a specified portion of configuration space. As I noted above, I've been talking mainly about the mathematical parallels amongst wave equations. The interpretation of the equations and solutions can be all over the map.

vanesch said:
Now, it is so that the application of the Born rule to a quantum optics problem gives you in many cases just the classical intensity as a probability, which somehow instills the confusion between "classical power density" and "quantum probability", but these are entirely different concepts. The classical power density has nothing of a probability density in a classical setting, and a probability has nothing of a power density in a quantum setting.

Right, I agree. As I noted above, I've been talking mainly about the mathematical parallels amongst wave equations. The interpretation of the equations and solutions can be all over the map.

vanesch said:
The classical superposition of waves has as much to do with probabilities as the superposition of, say, forces in Newtonian mechanics. Consider the forces on an apple lying on the table: there's the force of gravity, downward, and there's the force of reaction of the table, upward. Now, does that mean that we have a 50% chance for the apple to undergo a downward fall, and a 50% chance for it to be lifted upward? This is exactly the same kind of reasoning that is applied when saying that two classical fields interfering is equivalent to two quantum states interfering. The classical fields were BOTH physically there (just as the two forces are). Their superposition gives rise to a certain overall field (just as there is a resultant force 0 on the apple). At no point is a probability invoked.

But in the quantum setting, the wave function is made up of two parts. If this "made up of two parts" is interpreted as a probability at a certain point, you'd be able to track the system on the condition that case 1 was true, and to track the system on the condition that case 2 was true. But that's not what the result is when you compute the interference of the two parts. So when the wave function was made up of 2 parts, you CANNOT consider it as a probability distribution. But when the interference pattern is there, suddenly it DID become a probability distribution. THAT'S where a sleight of hand took place.

What are the two parts?

Below, I beg to differ. That is, I said "effectively" and "sorta equivalent" to two beams -- we are talking a metaphor, which helps some of us in understanding the two-slit situation. That is, the two-slit experiment is much like a scattering one -- the mathematics describing the two-slit QM experiment is very similar to that for scattering -- look at the Lippmann-Schwinger equation for example. Again, it's as if there were two beams emanating from the slits.
vanesch said:
Absolutely. But classically, that would mean that BOTH beams are physically present, and NOT that they represent a 50/50 probability!

This is ONLY true when ACTUAL measurements took place! You cannot (such as at the slits in the screen) "pretend to have done a measurement" and then sum over all cases, which you can normally do with a genuine evolving probability distribution.
If the wave function decomposes, at a time t1, into disjoint states A, B and C, then, if it were possible to interpret the wave function as a probability density ALL THE TIME, it would mean that we can do:

P(X) = P(X|A) P(A) + P(X|B) P(B) + P(X|C) P(C)

and that *doesn't work* in quantum theory if we don't keep any information (by entanglement for instance) about A, B or C. So when we have the wave function expanded over A, B and C, we CANNOT see it as giving rise to a probability distribution.

Now, of course, for an actual setup, quantum theory generates a consistent set of probabilities for observation in the given setup. But the way the wave function behaves *in between* is NOT interpretable as an evolving probability distribution.
So the wave function is sometimes giving rise to a probability distribution (namely at the point of "measurements"), but sometimes not.
Now, you can take the attitude that the wave function is just "a trick relating preparations and observations". Fine. Or you can try to give a "physical picture" of the wave function machinery. Then you're looking into interpretations.
Conditional probabilities are a bit tricky at the wave function level, but not so much at the density matrix level. In your example,

P(X) = |W_A(X) + W_B(X) + W_C(X)|^2, where W_A is the wave function for state A.

Or, P(X) = W_A^2 + 2 W_A W_B + ...

(I've dropped the X and the complex conjugate signs)

Now this is a quadratic form that can be diagonalized, which means three disjoint states with probability densities once they are properly normalized. These states are linear combinations of all the states A, B, C. These disjoint states are nicely described by the standard Bayes theorem, which you cite above, and contain interference terms between A, B, C.

Regards,
Reilly
 
  • #147
reilly said:
Thus, it seems to me that the QP is indeed always a probability density, given its mathematical properties. Why not?

Well, for the very reason I repeat again. If we take the wavefunction of the particle, and we let it evolve unitarily, then at the slit, the wavefunction takes on the form:
|psi1> = |slit1> + |slit2>
which are essentially orthogonal states at this point (in position representation, |slit1> has a bump at slit 1 and nothing at slit 2, and vice versa).

Now, if this is to have a *probability interpretation*, then we have to say that at this point, our particle has 50% chance to be at slit 1 and 50% chance to be at slit 2, right ?

A bit later, we evolve |psi1> unitarily into |psi2> and this time, we have an interference pattern. We write psi2 in the position representation, as:
|psi2> = sum over x of f(x) |x> with f(x) the wavefunction.

This time, we interpret |f|^2 as a probability density to be at point x.

Now, if at the first instance, we had 50% chance for the particle to be at slit 1, 50% chance to be at slit 2, then it is clear that |f|^2 = 0.5 P(x|slit1) + 0.5 P(x|slit2), because this is a theorem in probability theory:

P(X) = P(X|A) P(A) + P(X|B) P(B)

if events A and B are mutually exclusive and complete, which is the case for "slit 1" and "slit 2".

But we know very well that |f|^2 = 0.5 P(x|slit1) + 0.5 P(x|slit2) is NOT true for an interference pattern!

So in no way can we see |psi1> as a probability density giving a 50% chance to go through slit 1 and a 50% chance to go through slit 2.
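
To make this concrete, a minimal numerical sketch (a toy model only: Gaussian bumps at the two slits, and the screen pattern taken as the far-field Fourier transform of the slit amplitude):

Code:
import numpy as np

x = np.linspace(-10, 10, 4096)
dx = x[1] - x[0]

def bump(center, width=0.2):
    # Normalized Gaussian bump standing in for "the wavefunction at one slit".
    psi = np.exp(-((x - center) ** 2) / (2 * width ** 2)).astype(complex)
    return psi / np.sqrt(np.sum(np.abs(psi) ** 2) * dx)

psi1 = bump(-1.0)   # "went through slit 1"
psi2 = bump(+1.0)   # "went through slit 2"

def screen_density(psi):
    # Far-field (Fraunhofer) pattern: |Fourier transform|^2, normalized to a density.
    pattern = np.abs(np.fft.fftshift(np.fft.fft(psi))) ** 2
    return pattern / (np.sum(pattern) * dx)

quantum = screen_density((psi1 + psi2) / np.sqrt(2))                 # keep both branches
mixture = 0.5 * screen_density(psi1) + 0.5 * screen_density(psi2)    # "50% slit 1, 50% slit 2"
print(np.max(np.abs(quantum - mixture)))                             # clearly non-zero

The fringes present in the first pattern are simply absent from the 50/50 mixture, so the two disagree at almost every point on the screen.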

reilly said:
Conditional probabilities are a bit tricky at the wave function level, but not so much at the density matrix level. In your example,

P(X) = |W_A(X) + W_B(X) + W_C(X)|^2, where W_A is the wave function for state A.

Or, P(X) = W_A^2 + 2 W_A W_B + ...

(I've dropped the X and the complex conjugate signs)

Now this is a quadratic form that can be diagonalized, which means three disjoint states with probability densities once they are properly normalized. These states are linear combinations of all the states A, B, C. These disjoint states are nicely described by the standard Bayes theorem, which you cite above, and contain interference terms between A, B, C.

Regards,
Reilly

The point is that a pure state, converted into a density matrix, after diagonalisation, always results in a TRIVIAL density matrix: zero everywhere, and a single ONE somewhere on the diagonal, corresponding to the pure state (which is part of the basis in which the matrix is diagonal).

As such, your density matrix will simply tell you that you have 100% probability to be in the state...

If you don't believe me, for a pure state, we have that rho^2 = rho. The only diagonal elements that can satisfy this are 0 and 1. We also have that Tr(rho) = 1, hence we can only have one single 1.
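
A quick numerical illustration (a toy example in a 3-dimensional space spanned by A, B and C):

Code:
import numpy as np

# Pure state |psi> = (|A> + |B> + |C>)/sqrt(3), written as a density matrix.
psi = np.array([1, 1, 1], dtype=complex) / np.sqrt(3)
rho = np.outer(psi, psi.conj())

print(np.allclose(rho @ rho, rho))              # True: rho^2 = rho
print(np.round(np.linalg.eigvalsh(rho), 12))    # [0. 0. 1.]: a single 1 after diagonalisation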
 
  • #148
vanesch said:
the wavefunction takes on the form:
|psi1> = |slit1> + |slit2>
which are essentially orthogonal states at this point (in position representation, |slit1> has a bump at slit 1 and nothing at slit 2, and vice versa).

I don't see the deduction that they are orthogonal? I consider this an assumption originating from the use of "classical intuition", not something we know.

vanesch said:
because this is a theorem in probability theory:

P(X) = P(X|A) P(A) + P(X|B) P(B)

if events A and B are mutually exclusive and complete, which is the case for "slit 1" and "slit 2".

I agree completely that it's difficult to find the proper interpretation, but before giving up I have two objections.

(1) Assumption of mutually exclusive options.

In general we have
P(x|b \vee c) = \left[P(x|b)P(b) + P(x|c)P(c) - P(x|b \wedge c)P(b \wedge c) \right]\frac{1}{P(b \vee c)}

(2) Ambiguity in mixing information from different probability spaces.

Even accepting the formula above, it seems non-trivial to interpret this in terms of

\psi\psi^* = \frac{|b|^2|\psi_b|^2 + |c|^2|\psi_c|^2 + 2Re(bc^* \psi_b \psi_c^*) }{\sum_x |b\psi_b(x) + c\psi_c(x)|^2}

IMO, x and b do not belong to the same event space, considered as "simple sets"; therefore the definition of P(x|b) needs to be analysed.

P(x|b) is defined only if the conjunction of x and b is defined.

So there is a problem here; I agree with vanesch. But I think that the solution is to define the relation between the two spaces where x and b belong. This relation will provide a consistent definition of P(x|b). Right now P(x|b) is intuitive, but the formal definition is not - it is dependent on a background defining the relation between the two observables. This is where I am personally currently looking for the connection that will restore a consistent "probabilistic" reasoning. I think we both need a revision of probability theory itself, in the sense that the probability space NEEDS to be allowed to be dynamical - anything else just seems hopeless - AND we need to understand how the relations between observables are formed. Postulating relations amounts to guessing; I suggest this guessing be made more systematic.

/Fredrik
 
  • #149
Fra said:
I don't see the deduction that they are orthogonal? I consider this an assumption originating from the use of "classical intuition", not something we know.

No, it is pretty simple and straightforward. Consider the slits on the x-axis: slit one is from x = 0.3 to x = 0.4 and slit 2 is from x = 1.3 to 1.4.
The wavefunction at the slit, psi, will be a function that is zero outside of the above intervals. So we can write psi(x) as psi1(x) + psi2(x), where psi1(x) will have non-zero values between 0.3 and 0.4, and be 0 everywhere else, while psi2(x) will be non-zero only between 1.3 and 1.4, and zero everywhere else.

As such, psi1(x) and psi2(x) are orthogonal, because the integral of psi1*(x) psi2(x) dx is 0 (their supports don't overlap).
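
A quick check with exactly those numbers (taking the wavefunction to be constant over each slit, just for illustration):

Code:
import numpy as np

x = np.linspace(0, 2, 20001)
dx = x[1] - x[0]
psi1 = np.where((x >= 0.3) & (x <= 0.4), 1.0, 0.0)   # non-zero only over slit 1
psi2 = np.where((x >= 1.3) & (x <= 1.4), 1.0, 0.0)   # non-zero only over slit 2
print(np.sum(np.conj(psi1) * psi2) * dx)             # 0.0: disjoint supports, hence orthogonal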

Fra said:
I agree completely that it's difficult to find the proper interpretation, but before giving up I have two objections.

(1) Assumption of mutually exclusive options.

In general we have
P(x|b \vee c) = \left[P(x|b)P(b) + P(x|c)P(c) - P(x|b \wedge c)P(b \wedge c) \right]\frac{1}{P(b \vee c)}

Yes, but if the wavefunction was to be interpreted as a probability density, then there was 50% chance for psi1 and 50% chance for psi2 and nothing else, because, exactly, psi1 and psi2 are orthogonal. In other words, we can see them as elements of an orthogonal system in which we apply the Born rule (that's what it means, to give a probability density interpretation to a wavefunction: pick an orthogonal basis, and apply the Born rule).
The events corresponding to two orthogonal states are mutually exclusive.

Fra said:
(2) Ambiguity in mixing information from different probability spaces.

Even accepting the formula above, it seems non-trivial to interpret this in terms of

\psi\psi^* = \frac{|b|^2|\psi_b|^2 + |c|^2|\psi_c|^2 + 2Re(bc^* \psi_b \psi_c^*) }{\sum_x |b\psi_b(x) + c\psi_c(x)|^2}

IMO, x and b do not belong to the same event space, considered as "simple sets"; therefore the definition of P(x|b) needs to be analysed.

I'm fighting the claim that the wavefunction can ALWAYS be interpreted as a probability density. I'm claiming that this statement is not true and that at moments, you CANNOT see the wavefunction as just a probability density. You can in fact ONLY see it as a probability density when "measurements" are performed, but NOT in between, where it has ANOTHER (physical!) meaning, in the sense of things like the classical electromagnetic field. As such, one cannot simply wave away interpretational issues such as wavefunction collapse as just "changes of probability of stuff because we've learned something".

I'm trying to show here that the wavefunction is, most of the time, NOT to be seen as a probability density, but "jumps into that suit" when we say that we do measurements.

We take as basic example the 2-slit experiment, and I'm trying to show that when the wavefunction is at the two slits, it cannot be seen as just saying us that there is 50% chance that the particle went through slit 1 and 50% chance that the particle went through slit 2, but rather that there was something PHYSICAL that went through both, in the same way as a classical EM wave goes through both and doesn't surprise us when it generates interference patterns. The classical EM wave that goes through both slits is NOT to be seen as 50% chance that the lightbeam goes through the first hole, and 50% chance that the lightbeam goes through the second hole, and in the same way, the wavefunction at that point can also not be seen like that.

Now, saying that "it still might be a probability density, but not in the probability space in which we are looking at our results" is an equivalent statement to "it wasn't, after all, a probability distribution", because after all, things are a probability distribution only if they are part of the one and only "global" probability space of events.

ALL quantum weirdness reduces to the fact that quantum-mechanical superposition is NOT always statistical mixture. This is shown here in the double-slit experiment, but it is also the case in more sophisticated situations, including EPR experiments.

It is, for me, the basic motivation to look upon quantum theory through MWI goggles, in fact: there, it is obvious that superposition is not statistical mixture; as such, this view keeps one from falling into the most serious trap, IMO, in looking at quantum theory: namely, interpreting a superposition too lightly as a statistical mixture. It is also the reason why MWI is fundamentally different from the "universe of possibilities" that might be postulated in a random but classical universe, and many people don't seem to realize this.
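
To see the distinction in the smallest possible setting, here is a toy sketch (a single two-state system standing in for "went through slit 1" / "went through slit 2"):

Code:
import numpy as np

ket0 = np.array([1, 0], dtype=complex)               # "slit 1" branch
ket1 = np.array([0, 1], dtype=complex)               # "slit 2" branch
plus = (ket0 + ket1) / np.sqrt(2)
minus = (ket0 - ket1) / np.sqrt(2)

rho_super = np.outer(plus, plus.conj())                                           # coherent superposition
rho_mix = 0.5 * np.outer(ket0, ket0.conj()) + 0.5 * np.outer(ket1, ket1.conj())   # statistical mixture

# Asking "which slit?" cannot tell them apart:
print(np.diag(rho_super).real, np.diag(rho_mix).real)          # [0.5 0.5] for both

# Letting the branches interfere (measuring in the |+>, |-> basis) does:
print(np.real(minus.conj() @ rho_super @ minus))               # 0.0
print(np.real(minus.conj() @ rho_mix @ minus))                 # 0.5

Measured in the "which slit" basis the two cases look identical; as soon as the branches are allowed to interfere they come apart, which is exactly the superposition-versus-mixture distinction.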
 
  • #150
> it is obvious that superposition is not statistical mixture

Right. In the way you refer to this, this is clearly correct and I see what you are saying. It would be foolish to argue about this. I completely understand the difference between superpositions of wavefunctions and incoherent mixtures of classical probability distributions.

But we at least seem to disagree on how to resolve this and what conclusions to draw. Part of the problem, I think, is that _maybe_ we need new terminology and maybe we misunderstand each other, because we're both trying to extend the current description.

If we are going right by the book, I would say that Kolmogorov probability theory and its axioms are an imperfect choice of axioms to describe nature. The most severe objection is how to attach the probability space to observation. This space is (in standard QM) not measured. Instead there are a lot of silly ideas of infinite measurements. This is one problem, but there are more of them.

Maybe I caused the confusion, but IMO the difference lies not in probability theory mathematically, but in the application of the axioms to reality. It depends on how you see it. As you know there has been philosophical debate about the meaning of probability, having nothing to do with quantum mechanics. I'm relating to this as well.

vanesch said:
Now, saying that "it still might be a probability density, but not in the probability space in which we are looking at our results" is an equivalent statement to "it wasn't, after all, a probability distribution", because after all, things are a probability distribution only if they are part of the one and only "global" probability space of events.

You can put it that way. We are getting closer.

What I am suggesting is that there never exists a certain probability space, because the probability space is grown out of the retained history. But one can actually still define a conditional probability, defined not as the predicted relative frequency of an event - because this can never be known, only guessed - rather, this conditional probability is technically an expectation of a probability, based on an expected probability space. This is because the set of events and their relations are generally uncertain and changing.

I guess what I am saying is that, in the way I try to see things, I see no such thing as a definite probability, for the reason that neither the probability itself nor the probability space can be measured with infinite precision. One way to appreciate this is to think that the observer's ability to retain information is limited.

My standpoint has originated from trying to understand the process of inducing a probability space from data. This is related to data compression, but also to the response time needed to update the estimates. In essence the probability space is under constant "drift", as the observer's memory is remodeled. The dynamics of the remodelling is like a movie being played that is projected from the environment. The observer can never determine an exact probability of an external event; it can only deduce subjective expectations of effective probabilities - which are better seen as a bet distribution in a gaming perspective.

To not mix up language, we could forget about probabilities and invent new labels for them. But the thing is that it will still end up closely related to a probability theory, just a relational one.

I imagine that even space is a structure that is spontaneously formed in the observer's memory (and since I think Zurek's idea that "What the observer knows is inseparable from what the observer is" is dead on), and that this is also responsible for "physical structure formation". And in this context, the particle zoo and spacetime itself should be treated on an in-principle equal basis.

Sometimes I wonder if the different universes the MWI people talk about are really the set of all possible "images of universes projected on observers", considering all possible "observers"? If so, we may be closer than it seems. If that is so, then what I am saying is that the different universes do interact. But calling them universes is a very strange terminology IMO - I would call _that_ the projection of the unknown (some may call this the universe) onto the observer. Whether this match is correct or not, I am not entirely sure.

I guess when all of these ideas are worked out, testable predictions will come, and then it will probably also be easier to compare them. What I envision is not just an interpretation; when it's finished I expect a lot of things to pop out. But it seems not so many people are actually spending time on these things, otherwise it's a mystery why more has not been accomplished.

/Fredrik
 
