
Q-bit computation theory

  1. Oct 31, 2014 #1
    Can anyone help me understand how superposition leads to increased computational ability?

    I'm pretty sure I have all the background in conventional computing covered: I started programming BASIC 30 years ago and jumped directly into assembly language (position-independent machine code, then completely self-contained programs), then Pascal in high school to learn program structure, then on to 80386 code for center-out rendered polygon engines (without a math co-processor), and I've stayed current with true multitasking and OOP in Java...

    What I can't seem to comprehend is how "0 or 1 or both" is expected to increase computing power so dramatically. I have already contemplated optical processing, where a bit ("register" would be a better description) can hold a value from 0 to 1 at whatever frequency the photonic "register" would contain, to the degree of precision that the hardware could discern. Anyone who has a clue what I'm talking about would immediately realize the potential; I'm just not sure whether the scheme is physically feasible yet.

    So how would a string of Q-bit/byte/registers compare? Is the probabilistic nature of superposition expected to fulfill my concept of "0 to 1"?
  3. Nov 5, 2014 #2
    If you only have one qubit, you have no advantage. The advantage comes when you have a large system of qubits that share a single joint quantum state.

    With 1000 bits, you can encode any number from 0 to 2^1000-1. With 1000 qubits, in principle, you can encode all numbers from 0 to 2^1000-1 simultaneously and then manipulate that system to eliminate all multiples of 2, 3, 5, 7, and 11 - or at least reduce their likelihood to near zero. When the bits are "read out" from that system, you would discover that codes for your target numbers are much more likely than codes for those divisible by 2, 3, 5, 7, or 11.

    That's kind of a cursory "Hello World" example. It's probably not a practical example, but it gives you a concept of how qubits can be more powerful than bits.
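    To make that "Hello World" concrete, here is a rough classical sketch of the idea (pure Python, just simulating amplitudes by brute force, which is exactly what a real quantum computer would let you avoid): an n-qubit register is a vector of 2**n amplitudes, and "eliminating" outcomes means driving their amplitudes to zero.

```python
# Toy classical simulation of the thought experiment above: start a
# 10-qubit register in a uniform superposition over 0..1023, then
# suppress every basis state divisible by 2, 3, 5, 7 or 11 by zeroing
# its amplitude (standing in for the interference a real algorithm
# would engineer), renormalize, and "read out" by sampling.
import math
import random

n = 10                                  # 10 qubits -> 1024 basis states
N = 2 ** n
amps = [1 / math.sqrt(N)] * N           # uniform superposition

for k in range(N):
    if any(k % p == 0 for p in (2, 3, 5, 7, 11)):
        amps[k] = 0.0                   # suppress unwanted outcomes

norm = math.sqrt(sum(a * a for a in amps))
amps = [a / norm for a in amps]         # renormalize

# "Reading out" the register samples k with probability |amps[k]|**2,
# so a suppressed number can never be observed.
probs = [a * a for a in amps]
sample = random.choices(range(N), weights=probs)[0]
assert all(sample % p != 0 for p in (2, 3, 5, 7, 11))
```

    Of course, the whole point is that this simulation costs 2**n classical operations, while the quantum register would carry all 2**n amplitudes at once.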

    The most famous examples of qubit algorithms are Shor's and Grover's Algorithms.

    Shor's is the one with the potential to crack RSA and is the more famous of the two. But the mathematical method of encoding the data and interpreting the quantum results is a bit sophisticated. If you're not a Quantum Physics or Math major, you've got some studying to do.
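    One thing worth knowing about Shor's algorithm: the quantum part only does one job, finding the period r of f(x) = a^x mod N. Everything around it is classical number theory. A sketch of that classical wrapper (with the period found by brute force, which is the exponentially slow step a quantum computer would replace):

```python
# Classical skeleton of Shor's algorithm: given the period r of
# a**x mod N, a factor of N falls out via gcd. The while loop below
# brute-forces the period -- that is the only step the quantum
# computer actually speeds up.
from math import gcd

def factor_via_period(N, a):
    """Try to split N using the period of a**x mod N."""
    g = gcd(a, N)
    if g != 1:
        return g                        # lucky: a already shares a factor
    r, x = 1, a % N
    while x != 1:                       # brute-force period finding
        x = (x * a) % N
        r += 1
    if r % 2:                           # need an even period; retry with new a
        return None
    y = pow(a, r // 2, N)
    if y == N - 1:                      # trivial square root; retry with new a
        return None
    return gcd(y - 1, N)

print(factor_via_period(15, 7))         # period of 7 mod 15 is 4 -> prints 3
```

    So "cracking RSA" reduces to period finding, and period finding is where the superposition does the work.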
  4. Nov 10, 2014 #3
    Thanks for your reply! After some long and hard thought I think I understand where they are trying to go with it and why they haven't had much success.

    The entanglement aspect is irrelevant IMHO; it's all about the probabilistic nature of quantum systems. Probably getting an acceptable result within a finite, predetermined allotment of time seems like underachieving. In any given calculation there is only one "correct" answer (or a set of them), so the algorithm needs only multiple pathways constructed in a ground state (equilibrium); when the "answer" (for example, a large number composed of two primes) is fed into the processor in a magnetic field, only the logic gates that present the possible "paths" for the correct "problem" would be energized.

    Sorry if that is confusing; I'm trying to devise a quantum mechanical device that would exploit the "path of least resistance" philosophy.
  5. Nov 10, 2014 #4
    Suppose we had two wheels with prisms around the circumference, with refractive indices corresponding to the values of the prime numbers from 1 to n, and we rotate these wheels in opposite directions while EMR at a frequency corresponding to the value to be factored radiates through them. It would only radiate through the contraption when the primes correlate, and the radiation emitted would be at the initial wavelength, confirming the result is valid.

    I suppose the same concept would apply electrically if wheels of transformers and resistors corresponding to prime values only allow a ground state (zero result) to be achieved when the number of electrons submitted transforms to an equality.
  6. Nov 10, 2014 #5
    Okay, forget the electrical description above; much simpler here: imagine a wheel with conductors rotating in a magnetic field, connected to transformers of successively multiplied value; you start the rotation at 1 (relative to the energy created) and increase it until one of the circuits reaches the value to be factored.
  7. Nov 10, 2014 #6


    Science Advisor

    Here are some resources that should help:

    Quantum computing for the determined: An introductory YouTube series by one of the authors of https://www.amazon.com/Quantum-Computation-Information-Anniversary-Edition/dp/1107002176.

    Grover's Quantum Search Algorithm: A blog post explaining Grover search all the way from complex numbers to the actual circuit, with animations, by me. I wrote it when I knew a lot less about QC, so it covers the basic confusions better than I can now.
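    If you want to see Grover search with no machinery at all, the whole algorithm can be simulated with a list of floats (a rough sketch, not how you'd write it against real hardware): one Grover step flips the sign of the marked entry, then reflects every amplitude about the mean.

```python
# Minimal Grover search over N = 8 items (3 qubits), simulated with
# plain floats. Each iteration: (1) oracle flips the sign of the
# marked amplitude, (2) "inversion about the mean" reflects all
# amplitudes, pumping probability into the marked item.
import math

N, marked = 8, 5
amps = [1 / math.sqrt(N)] * N              # uniform superposition

steps = int(math.pi / 4 * math.sqrt(N))    # optimal ~2 steps for N = 8
for _ in range(steps):
    amps[marked] = -amps[marked]           # oracle marks the target
    mean = sum(amps) / N
    amps = [2 * mean - a for a in amps]    # inversion about the mean

# After 2 iterations the marked item holds 121/128 of the probability.
print(round(amps[marked] ** 2, 3))         # -> 0.945
```

    Two iterations instead of eight lookups; the sqrt(N) speedup is exactly Grover's advantage.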

    Basically, all the power comes from the fact that manipulating superpositions is equivalent to doing huge matrix multiplications on huge vectors. But you only have access to a limited set of unitary matrix multiplications (your gates), and you typically have to arrange things so that only one entry (the answer) in the final vector gets almost all of the weight. Also, in addition to those two gigantic limitations, there are all the practical engineering difficulties of making these things scale, which no one has quite solved yet.
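    To illustrate the "small gates, huge vectors" point, here is a sketch (in Python, simulating the state classically) of applying a single 2x2 Hadamard gate to one qubit of an n-qubit register: the gate is tiny, but it touches all 2**n amplitudes in pairs.

```python
# A gate is a small unitary, but the state is a vector of 2**n
# amplitudes. Applying a Hadamard to one qubit pairs up the
# amplitudes that differ only in that qubit's bit.
import math

def hadamard(state, qubit):
    """Apply H to one qubit of a 2**n amplitude vector."""
    out = state[:]
    s = 1 / math.sqrt(2)
    step = 1 << qubit
    for i in range(len(state)):
        if not i & step:                # i has the qubit = 0; partner has it = 1
            a, b = state[i], state[i | step]
            out[i] = s * (a + b)
            out[i | step] = s * (a - b)
    return out

n = 3
state = [0.0] * (1 << n)
state[0] = 1.0                          # start in |000>
for q in range(n):                      # H on every qubit
    state = hadamard(state, q)

# Three 2x2 gates produced a uniform superposition over all 8 states.
print([round(a, 4) for a in state])
```

    That's the whole trick and the whole constraint: every algorithm has to be built out of such small unitaries acting on an exponentially large vector.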
    Last edited by a moderator: May 7, 2017
  8. Nov 10, 2014 #7
    Does what I described seem like a Turing machine with wheels instead of tapes?
  9. Nov 10, 2014 #8
    Theoretically speaking, if there is a radial circuit branch for each possible unique logical path (let's say it "resists" incorrect inputs), and you energize this circuit, the path with the least resistance would be energized very quickly, indicating the solution path in one clock cycle.
  10. Nov 11, 2014 #9
    A hologram can be considered as a form of quantum computation, where the resulting image is the answer to the problem of how to code for more than just a single point of view on a scene. The hologram exploits the superposition properties of light in order to achieve a solution. A traditional photograph can be considered as exploiting only a classical model of light (e.g. a light-ray model), with the resulting image somewhat more limited in terms of what it reveals of a scene. The photographic film, in both cases, can be the same stock and same size. The quantum mechanical aspect is not in the film as such, but in which aspects of light are being exploited.

    And in both cases the amount of time required to create each image (hologram vs photograph) is not significantly different.

    Now consider the same task undertaken on a computer. Holograms can be computer generated. But on a classical computer it would take much longer to generate a holographic version of a scene, than a traditional single perspective view of a scene. So how can a traditional hologram code such a huge number of viewpoints on a scene in the same amount of time as it takes a traditional photograph to code only one?

    The classical computer isn't exploiting the physical materials from which it could otherwise be built, i.e. in any way that would allow it to perform the computation quantum mechanically.

    This is really a very loose analogy, but I think it gives a general sense of the qualitative difference between classical and quantum systems, and the huge difference in power being expected of quantum computation.

    Interestingly the way in which GPUs are programmed shares something in common with the way quantum computers would need to be programmed.

    Last edited: Nov 11, 2014
  11. Nov 12, 2014 #10


    Science Advisor
    Gold Member

    No, entanglement is crucial; without it you can't have quantum computing. This is also the reason why it is so difficult to build a quantum computer: creating a single qubit is easy but not interesting from a computational point of view, and creating an array of individual qubits is also straightforward (you just scale up the single-qubit scenario), but creating a large number of entangled qubits turns out to be extremely difficult. The current record is something like 10, which is just about enough to demonstrate that some of the algorithms that have been developed are working, but of course these "computers" are of no practical use at the moment.
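    A quick numerical illustration of what "entangled" means for two qubits (a sketch, not tied to any particular hardware): write the four amplitudes as a 2x2 matrix [[a00, a01], [a10, a11]]. The state factors into two independent single-qubit states exactly when the determinant a00*a11 - a01*a10 is zero; a nonzero determinant means the qubits are entangled and cannot be described separately.

```python
# Product test for a 2-qubit state: the amplitude matrix of a
# product state (x0|0>+x1|1>) (y0|0>+y1|1>) is the outer product
# [[x0*y0, x0*y1], [x1*y0, x1*y1]], whose determinant is always 0.
# A nonzero determinant therefore certifies entanglement.
import math

def is_entangled(a00, a01, a10, a11, tol=1e-12):
    return abs(a00 * a11 - a01 * a10) > tol

s = 1 / math.sqrt(2)
print(is_entangled(s, 0.0, 0.0, s))       # Bell state (|00>+|11>)/sqrt(2) -> True
print(is_entangled(0.5, 0.5, 0.5, 0.5))   # |+>|+> product state -> False
```

    It's exactly states like the Bell state, which carry correlations no pair of separate qubits can, that give the joint register its power.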

    Note also that only a small number of algorithms that can be used on a quantum computer have been developed. Some of these turn out to be very useful, but for the vast majority of problems a quantum computer will not be better than a classical computer. It would, however, be an extremely useful "sub-processor" for certain classes of problems (database searches, some optimization problems, and modelling other quantum systems, e.g. in chemistry).

    I should perhaps mention that there are different types of quantum computers; the adiabatic quantum computer (the type sold by D-Wave Systems) is not a "true" quantum computer in that it can only solve some very specific problems.
