Many-Worlds Interpretation Issue

  • #51
setAI said:
of course- except that Deutsch's arguments are backed up by one of the largest empirical efforts in history: the field of quantum computing- and many of the greatest physicists of our age have PUBLICLY ENDORSED him- such as Gell-Mann/ Rees/ Hawking/ Lloyd- not to mention that David Deutsch is probably the most respected by his peers and best-funded quantum physicist on the planet right now- and a good bet to receive the Nobel http://www.edge.org/3rd_culture/prize05/prize05_index.html

there simply is no excuse for the bias against his ideas I have seen on this forum- [and ONLY this forum- especially by mentors] when his ideas about the MWI are now nearly universally accepted/corroborated/empirically replicated by the actual scientists in the field- except I guess for certain isolated regions in the United States [rather like foreign policy it seems]
You are mixing things up. MWI and qubits have nothing to do with each other. The fact that qubits are fashionable nowadays does not mean at all that all these people buy his QM interpretation.

I have seen fashions come and go: fifth-generation AI computers, fuzzy logic systems, neural networks, functional languages, dataflow computers, Lisp systems, Prolog computers, 1-bit parallel-processor "thinking machines", et cetera, et cetera - all interesting academic exercises, each developing its own mathematical framework.

Each of them was going to revolutionize the world. Each one was going to stay forever and ever. Each time the academic community mastered the new fashion, they played with it, wrote papers, played, and played some more, until the day it became boring and unfashionable and the next fashion appeared...


Regards, Hans
 
  • #52
Neural networks, fuzzy logic, functional languages are all in current use.
 
  • #53
oh- I fully agree that the MWI is at best an incomplete fashionable idea- but it is by far the best we have right now- and everything I have seen from Deutsch/ Lloyd/ Gell-Mann/ Rees suggests that the MWI is currently the only interpretation of QM that we know about that is tenable- and at the moment I accept their overwhelming evidence that the advent of quantum computers has established the MWI and eliminated Copenhagen/hidden-variable interpretations in the same way that Hubble's discovery of the Doppler shift established the Big Bang model and eliminated the Steady State- [although this is the realm of interpretations of a single theory- not a competition between different theories- but with just as many implications for the nature of reality]

and so future progress in interpreting QM must proceed from and include the basic postulates of the MWI picture- just as progress in cosmological models has proceeded from and included the Big Bang
 
  • #54
0rthodontist said:
Neural networks, fuzzy logic, functional languages are all in current use.

Qubits will never go away altogether either; most things never do. They live on with the people who love to use them. In the end, however, there are many ways to achieve the same thing. The qubit as a mathematical framework is not unique either: it's a human invention, not something forced upon us by nature or physics.
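
Just as one illustration (a rough sketch using numpy, with an arbitrary single-qubit state of my own choosing): the very same state, and the very same measurement probabilities, can be written in the amplitude-vector picture or in the Bloch-vector picture. Neither notation is handed to us by nature.

Code:
# One qubit state written in two interchangeable, human-made notations
# (illustrative sketch only).
import numpy as np

# Amplitude picture: a normalized vector in C^2.
psi = np.array([np.cos(0.3), np.exp(1j * 0.7) * np.sin(0.3)])

# Bloch picture: expectation values of the Pauli operators for the same state.
X = np.array([[0, 1], [1, 0]])
Y = np.array([[0, -1j], [1j, 0]])
Z = np.array([[1, 0], [0, -1]])
bloch = [np.real(psi.conj() @ P @ psi) for P in (X, Y, Z)]

p0_amplitudes = abs(psi[0]) ** 2   # probability of outcome 0, amplitude picture
p0_bloch = (1 + bloch[2]) / 2      # the same probability, Bloch picture
print(np.isclose(p0_amplitudes, p0_bloch))   # True: same physics, different bookkeeping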


Regards, Hans
 
  • #55
setAI said:
and at the moment I accept their overwhelming evidence that the advent of quantum computers has established the MWI and eliminated Copenhagen
How do you manage to keep saying that without ever addressing the (obvious) objection everyone has been raising?


And I just realized a (fatal?) flaw in your argument based on counting particles in the universe. It is based on the fact that the state-space of a collection of n qubits is (2^n)-dimensional.

But there's a big assumption: that the whole state space is "accessible".

To wit, the state at any intermediate stage of the computation is a deterministic function of the state of the input. Since any intermediate state can be recomputed from the input and the fixed program, it takes no more (classical!) bits to specify an intermediate state of the computation than it does to specify the input to the computer. (Of course, certain representations will require more bits than necessary.)

In fact, a quick Google search turns up this:
http://ieeexplore.ieee.org/search/wrapper.jsp?arnumber=694607
which claims that any quantum algorithm that runs in O(s) space can be simulated by a probabilistic classical algorithm that uses no more than O(s) space, or by a deterministic classical algorithm that uses no more than O(s²) space.
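
To make that bookkeeping concrete, here is a toy sketch (just numpy and a small circuit of my own choosing, not the construction from that paper). The naive state vector of n qubits has 2^n complex entries, yet every entry is fixed once you know the n-bit input and the circuit:

Code:
# Toy classical simulation of an n-qubit register (illustrative sketch only).
# The naive state vector holds 2**n complex amplitudes, but every intermediate
# state is a deterministic function of the n-bit input plus the fixed circuit.
import numpy as np

def hadamard_all(state, n):
    """Apply a Hadamard gate to each of the n qubits of a state vector."""
    H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
    for q in range(n):
        state = state.reshape([2] * n)
        state = np.tensordot(H, state, axes=([1], [q]))  # act on qubit q
        state = np.moveaxis(state, 0, q).reshape(-1)     # restore qubit order
    return state

n = 3
x = 0b101                               # the n-bit classical input
state = np.zeros(2 ** n, dtype=complex)
state[x] = 1.0                          # |x>: specified by just n bits
state = hadamard_all(state, n)          # 2**n amplitudes, all determined by x
print(len(state), abs(state[0]) ** 2)   # 8 amplitudes, each with |amplitude|^2 = 1/8

The memory of the naive simulation blows up as 2^n, but the information needed to specify the state never exceeds the n input bits plus the circuit description.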
 
  • #56
setAI said:
oh- I fully agree that the MWI is at best an incomplete fashionable idea- but it is by far the best we have right now- and everything I have seen from Deutsch/ Lloyd/ Gell-Mann/ Rees suggests that the MWI is currently the only interpretation of QM that we know about that is tenable- and at the moment I accept their overwhelming evidence that the advent of quantum computers has established the MWI and eliminated Copenhagen/hidden-variable interpretations in the same way that Hubble's discovery of the Doppler shift established the Big Bang model and eliminated the Steady State- [although this is the realm of interpretations of a single theory- not a competition between different theories- but with just as many implications for the nature of reality]

and so future progress in interpreting QM must proceed from and include the basic postulates of the MWI picture- just as progress in cosmological models has proceeded from and included the Big Bang

You know, I'm all for freedom of religion; the problem is that each religion always wants to prohibit all the other ones... :rolleyes:

It's bogus to say that quantum computing proves MWI. The examples
shown on the Deutsch video apply just as well to entirely classical
systems based on the superposition of electromagnetic waves!
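
Here is a minimal sketch of what I mean (nothing but two superposed classical waves with a relative phase and a detector that records intensity; the numbers and the use of numpy are my own):

Code:
# Two-path interference with purely classical waves (illustrative sketch only).
# Superpose two electromagnetic waves with a relative phase and detect the
# intensity: the familiar interference fringes appear without any qubits.
import numpy as np

phase = np.linspace(0, 2 * np.pi, 9)    # relative phase between the two paths
path1 = 1.0                             # amplitude along path 1
path2 = np.exp(1j * phase)              # amplitude along path 2, phase shifted
intensity = np.abs(path1 + path2) ** 2  # superpose first, then detect
print(np.round(intensity, 2))           # swings between 4 (constructive) and 0 (destructive)

The fringes come from nothing but linear superposition, which classical electromagnetism has had all along.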

This means that classical physics would prove MWI. :confused:

Regards, Hans
 
  • #57
Hans de Vries said:
Qubits will never go away altogether either; most things never do. They live on with the people who love to use them. In the end, however, there are many ways to achieve the same thing. The qubit as a mathematical framework is not unique either: it's a human invention, not something forced upon us by nature or physics.


Regards, Hans
Functional languages, fuzzy logic, and especially neural networks aren't "going away" in any sense of the phrase. They are very popular.
 
  • #58
CarlB said:
Yes, I agree with you completely here. I think that quantum mechanics points towards the thing that is underneath, but I do not think that quantum mechanics itself is very close to the thing underneath. Quantum mechanics is as good as we've got at this time, but that doesn't mean that it is an accurate description of reality.

Ok, but I have always maintained that in order to give an interpretation to *quantum theory* one shouldn't start by saying that it is wrong and then quickly invent a few properties of a non-existent, potentially underlying theory that could explain it all. One should stick to the theory that one is trying to interpret, and start by assuming that it is right; in other words, imagine a toy universe in which said theory holds strictly.
It is too easy to speculate about the properties of an underlying theory (which you still don't have and which is certainly not experimentally tested) and then base an interpretation of an actual, working theory on that, no?

So, my claim is that *in a toy universe where quantum theory is strictly true*, it is hard to avoid an MWI kind of ontology. Now, to what extent that toy universe corresponds to ours is an open question, and it will always remain an open question. But the exercise was not to say how our universe "really is" (as nobody "really knows" and never will); the exercise was to interpret quantum theory.
 
  • #59
vanesch said:
personally, I don't buy the idea that "quantum theory is JUST an algorithm to calculate probabilities of outcomes". Of course it is an algorithm, TOO. But saying that one should think of it as ONLY that is giving up on the essential part of science, because we've switched from investigating the nature of nature to "stamp collecting".
We had a long discussion about this. You still haven't gotten my point. Of course there is more than the algorithm. How else could there be so much stuff on my website about the ontological implications of the mathematical formalism of the theory? What I object to is the naive transmogrification of an algorithm (which depends on the time of a measurement) into an evolving instantaneous state of affairs. Going beyond probabilities requires a careful analysis of the quantum-mechanical probability assignments.
 