
Programming Quantum Computers - Classical Techniques Obsolete?

  1. Jul 26, 2011 #1
    This question has been bugging me. I have a math degree, and my computer knowledge is limited to VERY BASIC programming and being able to build my own PC, so I thought this would be a good place to ask.


    Note: This question has no "clean cut" forum to fit into. I read ALL the forum descriptions and decided this was the best. Please move if necessary.
    __________________________________________________

    I know quantum computing is in its infancy (or for all practical purposes completely out of reach). But, assuming that one day we'll have mass-produced quantum computers, does anyone know how computer science will change? Will all the data structure, computer architecture, programming language, etc. classes be a waste of time and will CS majors have to start over? Specifically, how will this change the high-level programmer? Will quantum programs be able to have a similar syntax to major ones today like C, C++, or Java?

    Hypotheses from knowledgeable people in software programming or computer architecture/engineering are especially appreciated! Thanks in advance!
     
  3. Jul 27, 2011 #2

    chiro

    Science Advisor

    With any form of computation, the two key ingredients are state and flow control.

    Most algorithms have very clear guidelines about how state fits into a particular algorithm.

    The flow control however is the big thing.

    When you think of procedural programming (I call it linear), then yes, you have that method. But there is a group of languages called "non-procedural languages" whose syntax admits a different way of doing computation (or at least of thinking about it). A sketch of the contrast is below.
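
    As a toy illustration (a minimal Python sketch, nothing quantum-specific): the procedural version spells out the state updates and flow control explicitly, while the declarative version just describes what the value is.

[code]
from functools import reduce
from operator import mul

# Procedural ("linear") style: explicit state, mutated step by step.
def factorial_procedural(n):
    result = 1                  # the state
    for i in range(2, n + 1):   # the flow control, spelled out
        result *= i
    return result

# Declarative style: describe WHAT the value is, not HOW to update state.
def factorial_declarative(n):
    return reduce(mul, range(2, n + 1), 1)

assert factorial_procedural(5) == factorial_declarative(5) == 120
[/code]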

    So naturally the big thing is to focus on how to organize the flow control of the actual computation and represent that accordingly.

    In conjunction with this, you develop a language that strikes a balance between ease of use and the ability to do what it was designed to do.

    So I guess in saying this, if usable platforms do become available, there will have to be frameworks designed with that flow control in mind.

    In terms of state, though, I can't see one getting away from the idea of discrete data, simply because of how computation works: data needs to be clearly defined with respect to other data, which leads to decisions that are "sharp" (think of a Heaviside function), and that in turn leads to states which are atomic (i.e. discrete).
     
  4. Jul 27, 2011 #3
    That makes sense. I didn't think about the distinction between state and flow. Would an analogous way to think about this be sort of like sets (states) and functions (flow)? Obviously, those are not the same things, but I'm trying to think of this in terms of combinatorics.

    I don't fully understand why computation must be discrete, but it definitely makes sense on an intuitive level. I wasn't familiar with the Heaviside step function, but I now know a little about it after reading some. Would I also be correct in thinking that this function represents a sort of transition between the discrete and the continuous?
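
    From what I read, it's defined as

    [tex]H(x) = \begin{cases} 0 & x < 0 \\ 1 & x \ge 0 \end{cases}[/tex]

    so it jumps sharply from one value to the other with nothing in between, which I take to be the "sharp decision" you mean.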

    Thanks for your reply btw!
     
  5. Jul 27, 2011 #4

    chiro

    Science Advisor

    For the discrete argument, think about the main operators in computation: you have arithmetic, logical, and things like comparison operators.

    Now imagine having non-discrete "registers". In other words the size of the register is infinite since it has the capacity to represent a continuous variable.

    Clearly you can see that you will run into problems. If you apply simple operations like addition, subtraction, or multiplication, the computation will, in the general case, never finish.

    The above assumes the general case, meaning that the registers have to be able to represent any number at all, not just special cases like whole numbers or surds.

    So for computation to be feasible you need instructions that can be completed in a finite amount of time, and for that to be the case the data (or registers) of the machine have to be discrete or countable. A small illustration is below.
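
    Even the "real number" types on today's machines are secretly discrete. A minimal Python sketch of the point (math.nextafter returns the next representable floating-point value):

[code]
import math

x = 1.0
nxt = math.nextafter(x, 2.0)  # the very next float the machine can represent
print(nxt - x)                # ~2.22e-16: a finite gap, so floats are discrete

# A 64-bit register holds at most 2**64 distinct values; between any two of
# them there is a gap. A truly continuous register would need infinitely
# many bits, and general arithmetic on it could never finish.
[/code]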

    With regard to your statement about functions, that is a good way to think about it. In the general case, the functions act on your complete state space, though for small computations you can of course reduce that state space to something manageable.

    So, in saying that, you could think of a single function that relates each state to the next. Your time variable would be linked to both the registers and the memory of the machine. A minimal sketch of that view is below.
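
    Here is a minimal Python sketch of computation as repeated application of one transition function to the whole machine state (the "instruction" is hypothetical, just for illustration):

[code]
# Machine state = (registers, memory); time = one application of step().
def step(state):
    regs, mem = state
    pc, acc = regs['pc'], regs['acc']
    # hypothetical instruction: add the word at pc into an accumulator
    return ({'pc': pc + 1, 'acc': acc + mem[pc]}, mem)

state = ({'pc': 0, 'acc': 0}, [3, 1, 4])
for _ in range(3):      # advance three time steps
    state = step(state)
print(state[0]['acc'])  # 8
[/code]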

    If you are interested in a mathematical view of converting programs to mathematical statements Gregory Chaitin has written a good book about converting LISP programs to statements in number theoretic terms. The equations are enormous, but it should give you an idea of how a conversion actually works and is calculated.

    Basically, Chaitin's idea is that when you run a program to compute something, the possible solutions of that something can be found by solving a massive Diophantine equation.

    With regard to the combinatorics, I don't completely understand what you mean, so I can't comment on that.
     
  6. Jul 27, 2011 #5
    I kind of feel stupid after you explained the discrete thing. The explanation makes it obvious, thank you.

    I might check that out. I'm not daunted by big equations. Anything explained in more mathematical terms is easier on my brain.

    By combinatorics I just meant trying to interpret computation in a discrete/graph theoretical manner which would include mostly sets and functions (and to a limited extent operators and certain other categories). But you explained that nicely, so don't worry.

    Much thanks again! I'm glad to have some reference points to go off of now when thinking about these things.
     
  7. Aug 2, 2011 #6

    -Job-

    Science Advisor

    Quantum Computers are entirely different from conventional computers and are great at solving specific problems, but for the types of problems solved by today's computers they are either impractical or offer no substantial performance gains.

    In Computer Science, the class BQP is associated with the problems that a Quantum Computer is able to solve quickly. The "P" in BQP stands for Polynomial time (with respect to the size of the input to the problem), which is informally synonymous with "fast"; the "B" and "Q" stand for bounded-error and Quantum. Any problem in BQP can be solved readily by a QC, even for large instances.

    Another class, P, is analogous to BQP but for conventional computers - that is, it contains the problems that today's computers are able to solve efficiently. Many frequently occurring problems live in P, e.g. sorting, searching a list, matrix multiplication, computing the greatest common divisor, and even testing for primality. In other words, many of the tasks performed by today's popular software applications are already efficiently solved by laptops and mobile devices - in some cases they're already as fast as we need them to be, because they act on data sets of almost insignificant size.
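
    For instance, Euclid's algorithm computes the greatest common divisor in time polynomial in the input's bit length (a standard textbook example, sketched in Python):

[code]
def gcd(a, b):
    # Every two iterations at least halve b, so the loop runs O(log b)
    # times -- polynomial in the number of bits of the input.
    while b:
        a, b = b, a % b
    return a

print(gcd(1071, 462))  # 21
[/code]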

    For large data sets, we're able to cluster computers or CPUs to improve performance. For example, in Parallel Computing, sorting [itex]n[/itex] numbers with [itex]\log(n)[/itex] CPUs is done in [itex]O(n)[/itex] (linear) time, as opposed to the [itex]O(n \log n)[/itex] time it would take a single CPU (the arithmetic is below). It's faster, but has the additional cost associated with the extra CPUs. Additionally, some problems are not efficiently parallelizable - in particular, problems that interact with a user or other devices. This is one reason why a working QC won't replace our existing CPUs: there's no inherent gain (with some possible exceptions such as games and math or data analysis software running with ambitious data sets).
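
    The arithmetic behind that claim, roughly: the [itex]O(n \log n)[/itex] of comparison work is spread evenly (in the ideal case) across the [itex]\log n[/itex] CPUs, so

    [tex]T_{\text{parallel}} = O\!\left(\frac{n \log n}{\log n}\right) = O(n).[/tex]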

    One problem that Quantum Computers are substantially better at solving is Integer Factorization - given a number, output its prime decomposition. This problem is in BQP and is believed to lie outside P, meaning no known conventional algorithm solves it in polynomial time (the best known classical methods are super-polynomial in the number of bits), whereas a QC running Shor's algorithm can solve it in roughly [itex]b^3[/itex] operations - where [itex]b[/itex] is the number of bits - much faster by comparison. As an example, a number with hundreds of digits can be factored on a QC in roughly a billion operations, i.e. instantly as perceived by a person. On a conventional computer, factoring the same number requires a number of operations possibly exceeding the estimated number of atoms in the universe.
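
    To see where the classical pain comes from, here is the naive approach in Python (a sketch; serious classical algorithms like the general number field sieve do much better, but are still super-polynomial in the bit length [itex]b[/itex]):

[code]
def trial_division(n):
    # Tries divisors up to sqrt(n): about 2**(b/2) candidates for a
    # b-bit number, i.e. exponential in the bit length.
    factors = []
    d = 2
    while d * d <= n:
        while n % d == 0:
            factors.append(d)
            n //= d
        d += 1
    if n > 1:
        factors.append(n)
    return factors

print(trial_division(8051))  # [83, 97]
[/code]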

    This dramatic difference in performance is only visible for very large numbers. Since computer software doesn't routinely have to deal with such large quantities there's no pressing need to switch to Quantum Computers. The more significant impact of QCs would be in cryptography since RSA, a popular algorithm used in public key encryption is vulnerable to fast integer factorization.

    As useful as Quantum Computers would be, they're not proven to offer any advantage in solving a class of difficult and frequently occurring problems (called NP-complete problems). As an example, even a QC would take forever to solve a very large Sudoku - it's curious that lots of games correspond to particularly difficult theoretical problems; generalized Go, for instance, is completely intractable.

    Quantum Computers are probabilistic (rather than deterministic) machines, so there's a small probability that they'll output the wrong answer, but by using repeated runs we can reduce the probability of error below that of memory corruption or hardware failure, so it's not very significant in practice. There's a computational class called BPP (bounded-error probabilistic polynomial time) which contains the problems that can be efficiently solved by randomized algorithms running on conventional computers, and some of those algorithms offer improved performance with negligible risk of error.
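
    A sketch of the repetition trick in Python (the noisy "algorithm" here is artificial, with an assumed error rate of 1/4; a majority vote over independent runs drives the error down exponentially):

[code]
import random

def noisy_answer(truth, err=0.25):
    # stand-in for one run of a probabilistic machine
    return truth if random.random() > err else 1 - truth

def majority_of(k, truth):
    votes = sum(noisy_answer(truth) for _ in range(k))
    return 1 if votes > k / 2 else 0

# With 101 repetitions the majority vote is almost never wrong.
trials = 10_000
wrong = sum(majority_of(101, truth=1) != 1 for _ in range(trials))
print(wrong / trials)  # ~0.0
[/code]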

    There are some QC simulators available which use their own notation, more comparable to assembly than to Java or C, given the nature of QCs.

    How do you program for Quantum Computers? Suppose you have a particle, or quantum bit, with some measurable property whose value is 0 or 1. Until the value is measured there is some probability of it being 0 and some probability of it being 1. If you have n such qubits, you have a probability distribution that indicates the probability of each possible joint state. With n qubits there are [itex]2^n[/itex] states, each possibly with a non-zero probability depending on how the qubits arrived at that state.
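
    In a simulator this distribution is stored literally: the state of [itex]n[/itex] qubits is a vector of [itex]2^n[/itex] complex amplitudes, whose squared magnitudes give the probabilities. A minimal numpy sketch:

[code]
import numpy as np

n = 3
state = np.zeros(2**n, dtype=complex)
state[0] = 1.0             # all three qubits start as 0, i.e. state |000>

probs = np.abs(state)**2   # probability of each of the 2**n outcomes
print(probs)               # [1. 0. 0. 0. 0. 0. 0. 0.]
[/code]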

    The fact that the "Universe" is, or seems to be, tracking the likelihood of each possible outcome (strictly, a complex amplitude whose squared magnitude is the probability) is what Quantum Computers exploit. If we were living inside a computer, and that computer were simultaneously tracking every possible outcome in our day-to-day lives, we would be able to exploit this for computation by determining how our actions affect the possible outcomes and their corresponding probabilities. With each action the universe would have to update this large set of numbers that it's tracking. Each such update resembles a mathematical operation - an operation performed against a set of [itex]2^n[/itex] numbers.

    The idea is to take an initial state of n qubits and apply successive operations to get into a state whose probability distribution matches a useful state (in integer factorization for example, the set of numbers with a given period). Finally we perform additional operations that cause the desired solution to be the most likely state, such that when we do measure the final state it yields the correct solution with high probability. A QC simulator allows you to analyze the probabilities for each state and apply operations corresponding to Quantum gates, so in the end it resembles assembly language more than anything.
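
    A minimal sketch of what such a simulator does under the hood (not any particular product's notation): a gate on [itex]n[/itex] qubits is a [itex]2^n \times 2^n[/itex] matrix applied to the amplitude vector, and measurement is a sample from the resulting distribution.

[code]
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard gate on one qubit

n = 2
state = np.zeros(2**n, dtype=complex)
state[0] = 1.0                                 # start in |00>

# Apply H to each qubit: the full 4x4 operator is the Kronecker product.
op = np.kron(H, H)
state = op @ state                             # uniform superposition

probs = np.abs(state)**2
probs /= probs.sum()                           # guard against rounding
outcome = np.random.choice(2**n, p=probs)      # "measurement" = one sample
print(probs, format(outcome, '02b'))
[/code]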
     
  8. Aug 3, 2011 #7

    -Job-

    Science Advisor

    In short, using a Quantum Computer for conventional purposes is like using a nuclear reactor to power a mobile device. It's much more useful in isolated and specific contexts.
     
  9. Aug 4, 2011 #8
    Thank you for that! That answered a lot of lingering questions I still had.

    So, if you were just beginning studies in Computer Science, in order to "future-proof" your knowledge base, would you concentrate on the more theoretical aspects of the field such as computational complexity theory, discrete mathematics, assembly language, etc.? I'm looking at changing fields from mathematics and getting a CS degree since it interests me more.
     