Programming Quantum Computers - Classical Techniques Obsolete?

In summary: the key distinction is between state and flow control. Classical ideas about discrete state will likely carry over to quantum computing, but the high-level programmer will need to rethink flow control to take advantage of quantum hardware, and quantum speedups apply only to specific classes of problems.
  • #1
kings7
This question has been bugging me. I have a math degree, and my computer knowledge is limited to VERY BASIC programming and being able to build my own PC, so I thought this would be a good place to ask.


Note: This question has no "clean cut" forum to fit into. I read ALL the forum descriptions and decided this was the best. Please move if necessary.
__________________________________________________

I know quantum computing is in its infancy (or for all practical purposes completely out of reach). But, assuming that one day we'll have mass-produced quantum computers, does anyone know how computer science will change? Will all the data structure, computer architecture, programming language, etc. classes be a waste of time and will CS majors have to start over? Specifically, how will this change the high-level programmer? Will quantum programs be able to have a similar syntax to major ones today like C, C++, or Java?

Hypotheses from people knowledgeable in software programming or computer architecture/engineering are especially appreciated! Thanks in advance!
 
  • #2
kings7 said:
…assuming that one day we'll have mass-produced quantum computers, does anyone know how computer science will change? Specifically, how will this change the high-level programmer? Will quantum programs be able to have a similar syntax to major ones today like C, C++, or Java?

With any form of computation, the two key ingredients are state and flow control.

Most algorithms come with very clear guidelines about how state fits into them.

The flow control, however, is the big thing.

When you think of procedural languages (I call them linear), then yes, you have that method. But there is a group of languages called "non-procedural languages" whose syntax admits a different way of doing computation (or at least of thinking about it).

So naturally the big thing is to focus on how to organize the flow control of the actual computation and represent that accordingly.

In conjunction with this, you develop a language that strikes a balance between ease of use and the ability to do what it was designed to do.

So I guess in saying this, if usable platforms do become available, there will have to be frameworks that are designed with flow control in mind.

In terms of state, though, I can't see one getting away from the idea of discrete data, simply because computation requires data to be clearly defined with respect to other data. This leads to decisions which are "sharp" (think of a Heaviside function), and that in turn leads to states which are atomic (i.e. discrete).
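
(For reference: the Heaviside step function is [itex]H(x) = 0[/itex] for [itex]x < 0[/itex] and [itex]H(x) = 1[/itex] for [itex]x \geq 0[/itex]. The output jumps between exactly two values with nothing in between, which is the sense in which computational decisions are "sharp".)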
 
  • #3
chiro said:
With any form of computation, the two key ingredients are state and flow control. … In terms of state, though, I can't see one getting away from the idea of discrete data…

That makes sense. I didn't think about the distinction between state and flow. Would an analogous way to think about this be sort of like sets (states) and functions (flow)? Obviously, those are not the same things, but I'm trying to think of this in terms of combinatorics.

I don't fully understand why computation must be discrete, but it definitely makes sense on an intuitive level. I wasn't familiar with the Heaviside step function, but I now know a little about it after reading up on it. Would it also be correct to think of this function as representing a sort of transition between the discrete and the continuous?

Thanks for your reply btw!
 
  • #4
kings7 said:
Would an analogous way to think about this be sort of like sets (states) and functions (flow)? … I don't fully understand why computation must be discrete, but it definitely makes sense on an intuitive level.

For the discrete argument, think about the main operators in computation: you have arithmetic, logical, and things like comparison operators.

Now imagine having non-discrete "registers". In other words the size of the register is infinite since it has the capacity to represent a continuous variable.

Clearly you can see that you will run into problems. If you apply simple operations like addition, subtraction, or multiplication, the computation will, in the general case, never finish.

This assumes the general case, meaning that the registers have to be able to represent any number at all, not just special classes like whole numbers or surds.

So for computation to be feasible you need to have instructions that can be completed in a finite amount of time, and for this to be the case the data (or registers) of the machine have to be discrete or countable.
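
To make the finiteness point concrete, here is a minimal Python sketch (my own illustration): floating-point numbers are exactly this kind of compromise, a finite, discrete set of values standing in for the continuum.

[code]
# 64-bit floats are a discrete approximation of the real line:
# only finitely many values fit in a finite register.
import sys

a = 0.1 + 0.2
print(a)          # 0.30000000000000004, not 0.3
print(a == 0.3)   # False: none of 0.1, 0.2, 0.3 is exactly representable

# The spacing between representable values near 1.0 ("machine epsilon"):
print(sys.float_info.epsilon)  # roughly 2.22e-16
[/code]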

With regard to your statement about functions, that is a good way to think about it. The functions, however, act on your complete state space in the general case, though for small computations you can of course reduce this state space to something manageable.

So you could think of a function that maps each state to the next. Your time variable would be linked to both the registers and the memory of the machine.
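
As a toy Python sketch of that view (illustrative only): the entire machine state is one value, and running the program is just applying a transition function once per time step.

[code]
# The machine state is (registers, memory); the program is a pure
# function from state to state, applied once per time step.
def step(state):
    registers, memory = state
    pc = registers["pc"]
    # Toy "instruction": add the current memory cell to an accumulator.
    registers = {"acc": registers["acc"] + memory[pc], "pc": pc + 1}
    return (registers, memory)

state = ({"acc": 0, "pc": 0}, [3, 1, 4, 1, 5])
for t in range(5):          # the time variable indexes the steps
    state = step(state)
print(state[0]["acc"])      # 14, the sum of the memory cells
[/code]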

If you are interested in a mathematical view of converting programs to mathematical statements, Gregory Chaitin has written a good book about converting LISP programs to statements in number-theoretic terms. The equations are enormous, but it should give you an idea of how a conversion actually works and is calculated.

Basically, Chaitin's idea is that when you run a program to compute something, the possible solutions of that something can be found by solving a massive Diophantine equation.

With regard to the combinatorics, I don't completely understand what you mean, so I can't comment on that.
 
  • #5
chiro said:
So for computation to be feasible you need to have instructions that can be completed in a finite amount of time, and for this to be the case the data (or registers) of the machine have to be discrete or countable. … Gregory Chaitin has written a good book about converting LISP programs to statements in number-theoretic terms.

I kind of feel stupid after you explained the discrete thing. The explanation makes it obvious, thank you.

I might check that out. I'm not daunted by big equations. Anything explained in more mathematical terms is easier on my brain.

By combinatorics I just meant trying to interpret computation in a discrete/graph-theoretical manner, which would include mostly sets and functions (and, to a limited extent, operators and certain other categories). But you explained that nicely, so don't worry.

Much thanks again! I'm glad to have some reference points to go off of now when thinking about these things.
 
  • #6
kings7 said:
I know quantum computing is in its infancy (or for all practical purposes completely out of reach). But, assuming that one day we'll have mass-produced quantum computers, does anyone know how computer science will change? Will all the data structure, computer architecture, programming language, etc. classes be a waste of time and will CS majors have to start over? Specifically, how will this change the high-level programmer? Will quantum programs be able to have a similar syntax to major ones today like C, C++, or Java?

Quantum Computers are entirely different from conventional computers and are great at solving specific problems, but they are either impractical for, or offer no substantial performance gains on, the types of problems solved by today's computers.

In Computer Science, the class BQP (bounded-error quantum polynomial time) is associated with the problems that a Quantum Computer is able to solve quickly. The "P" in BQP stands for Polynomial time (with respect to the size of the input to the problem), which is informally synonymous with "fast". Any problem in BQP may be readily solved by a QC, even for large instances.

Another class, P, is analogous to BQP but for conventional computers - that is, it contains the problems that today's computers are able to solve efficiently. Many frequently occurring problems live in P, e.g. sorting, searching a list, matrix multiplication, computing the greatest common divisor, and even testing for primality. In other words, many of the tasks performed by today's popular software applications are already efficiently solved by laptops and mobile devices - in some cases they're already as fast as we need them to be because they act on data sets of almost insignificant size.
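
As a concrete example of a problem in P, here is Euclid's GCD algorithm in Python (a standard textbook sketch); its running time grows with the number of digits of the inputs, not with their magnitude.

[code]
# Euclid's algorithm: polynomial in the number of digits of the input.
def gcd(a, b):
    while b:
        a, b = b, a % b
    return a

print(gcd(1071, 462))           # 21
print(gcd(2**200 + 4, 2**100))  # instant even for 60-digit numbers
[/code]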

For large data sets, we're able to cluster computers or CPUs to improve performance. For example, in Parallel Computing, sorting [itex]n[/itex] numbers with [itex]\log(n)[/itex] CPUs can be done in [itex]O(n)[/itex] (linear) time, as opposed to the [itex]O(n \log n)[/itex] time it would take a single CPU. It's faster, but it carries the additional cost of the extra CPUs. Additionally, some problems are not efficiently parallelizable - in particular, problems that interact with a user or other devices. This is one reason why a working QC won't replace our existing CPUs: there's no inherent gain (with some possible exceptions such as games and math or data-analysis software running on ambitious data sets).
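
As a rough illustration of the trade-off (a Python sketch of my own; real parallel sorts are more careful about overheads): split the input across worker processes, sort the chunks in parallel, then merge.

[code]
import heapq
import random
from multiprocessing import Pool

def parallel_sort(data, workers=4):
    # Sort chunks in parallel, then do a k-way merge of the results.
    size = max(1, len(data) // workers)
    chunks = [data[i:i + size] for i in range(0, len(data), size)]
    with Pool(workers) as pool:
        sorted_chunks = pool.map(sorted, chunks)
    return list(heapq.merge(*sorted_chunks))

if __name__ == "__main__":
    data = [random.random() for _ in range(100_000)]
    assert parallel_sort(data) == sorted(data)
[/code]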

One problem that Quantum Computers are substantially better at solving is Integer Factorization - given a number, output its prime decomposition. This problem is in BQP and is believed to lie outside P, meaning the best known conventional algorithms take super-polynomial time (with respect to the number of bits in the number), whereas a QC can solve it in roughly [itex]b^3[/itex] operations - where [itex]b[/itex] is the number of bits - much faster by comparison. As an example, a large number with hundreds of digits can be factored on a QC in roughly a few million operations, which a person would perceive as instant. On a conventional computer, factoring the same number requires a number of operations possibly exceeding the estimated number of atoms in the universe.
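
You can see the exponential wall with the simplest classical method, trial division (a toy Python sketch; real factoring records use sieve algorithms, which are faster but still super-polynomial in the bit length [itex]b[/itex]):

[code]
# Trial division tests up to sqrt(N) = 2^(b/2) candidate divisors for
# a b-bit number N, so the work doubles with every two extra bits.
def factor(n):
    factors, d = [], 2
    while d * d <= n:
        while n % d == 0:
            factors.append(d)
            n //= d
        d += 1
    if n > 1:
        factors.append(n)
    return factors

print(factor(3 * 5 * 7 * 11))  # [3, 5, 7, 11], instant
# For a 2048-bit RSA modulus this loop would need ~2^1024 iterations.
[/code]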

This dramatic difference in performance is only visible for very large numbers. Since computer software doesn't routinely deal with such large quantities, there's no pressing need to switch to Quantum Computers. The more significant impact of QCs would be in cryptography, since RSA, a popular public-key encryption algorithm, is vulnerable to fast integer factorization.

As useful as Quantum Computers would be, they're not proven to offer any advantage in solving a class of difficult and frequently occurring problems (called NP-Complete problems). As an example, even a QC would take forever to solve a very large Sudoku. It's curious that lots of games correspond to particularly difficult theoretical problems; Go, for instance, is completely intractable.

Quantum Computers are probabilistic (rather than deterministic) machines, so there's a very small probability that they'll output the wrong answer, but by using repeated runs we can reduce the probability of error below that of data corruption or hardware failure, so it's not very significant in practice. There's a related computational class, called BPP (bounded-error probabilistic polynomial time), which contains the problems that can be efficiently solved by randomized algorithms running on conventional computers; some of these offer improved performance with a negligible risk of error.
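
The repetition argument is easy to quantify. Here's a small Python sketch, under the simplifying assumption that runs are independent and we take a majority vote over the outputs:

[code]
from math import comb

def majority_error(p_wrong, runs):
    # Probability that more than half of the independent runs are wrong.
    k = runs // 2 + 1
    return sum(comb(runs, i) * p_wrong**i * (1 - p_wrong)**(runs - i)
               for i in range(k, runs + 1))

for runs in (1, 11, 101):
    print(runs, majority_error(1/3, runs))
# The error falls from 0.33 to about 0.12 to about 2e-4 as runs grow.
[/code]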

There are some QC simulators available which use specialized notation, more comparable to assembly than to Java/C, given the nature of QCs.

How do you program for Quantum Computers? Suppose you have a particle, or quantum bit (qubit), with some measurable property whose value is 0 or 1. Until the value is measured, there is some probability of the value being 0 and some probability of it being 1. If you have [itex]n[/itex] such qubits, you have a probability distribution that indicates the probability of each possible joint state. With [itex]n[/itex] qubits there are [itex]2^n[/itex] states, each possibly with a non-zero probability depending on how the qubits arrived at that state.
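
To make that concrete, here's a minimal numpy sketch (my own illustration, not any particular simulator's API): the state of [itex]n[/itex] qubits is a vector of [itex]2^n[/itex] complex amplitudes, and the probability of each outcome is the squared magnitude of its amplitude.

[code]
import numpy as np

n = 3
state = np.zeros(2**n, dtype=complex)
state[0] = 1.0                    # start in the |000> basis state

probabilities = np.abs(state)**2  # one probability per basis state
for i, p in enumerate(probabilities):
    if p > 0:
        print(f"|{i:0{n}b}>: {p:.3f}")   # prints |000>: 1.000
[/code]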

The fact that the "Universe" is, or seems to be, tracking the probability of each possible outcome is what Quantum Computers exploit. If we were living inside a computer and that computer were simultaneously tracking every possible outcome in our day-to-day lives, we would be able to exploit this for computation by determining how our actions affect the possible number of outcomes and corresponding probabilities. With each action the universe would have to update this large set of probabilities that it's tracking. Each such update resembles a mathematical operation, an operation that's performed against a set of [itex]2^n[/itex] numbers.

The idea is to take an initial state of [itex]n[/itex] qubits and apply successive operations to reach a state whose probability distribution encodes something useful (in integer factorization, for example, a distribution concentrated on the period of a certain sequence of numbers). Finally, we perform additional operations that make the desired solution the most likely state, so that when we measure the final state it yields the correct solution with high probability. A QC simulator lets you inspect the probabilities of each state and apply operations corresponding to Quantum gates, so in the end it resembles assembly language more than anything.
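
Continuing the numpy sketch from above: a quantum gate is a unitary matrix applied to the amplitude vector, and measurement samples an outcome according to the final probabilities. Here, a Hadamard gate on each of two qubits produces a uniform distribution over all four outcomes:

[code]
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)  # Hadamard gate

# For two qubits the combined gate is the Kronecker (tensor) product.
state = np.zeros(4, dtype=complex)
state[0] = 1.0                                # |00>
state = np.kron(H, H) @ state                 # apply H to each qubit

probs = np.abs(state)**2                      # 0.25 for each outcome
outcome = np.random.choice(4, p=probs)        # the "measurement"
print(f"measured |{outcome:02b}>")
[/code]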
 
  • #7
In short, using a Quantum Computer for conventional purposes is like using a nuclear reactor to power a mobile device. It's much more useful in isolated and specific contexts.
 
  • #8
-Job- said:
In short, using a Quantum Computer for conventional purposes is like using a nuclear reactor to power a mobile device. It's much more useful in isolated and specific contexts.

Thank you for that! That answered a lot of lingering questions I still had.

So, if you were just beginning studies in Computer Science, in order to "future-proof" your knowledge base, would you concentrate on the more theoretical aspects of the field such as computational complexity theory, discrete mathematics, assembly language, etc.? I'm looking at changing fields from mathematics and getting a CS degree since it interests me more.
 

1. What is the difference between classical and quantum computing?

Classical computing relies on bits, which can only have a value of 0 or 1, to perform operations. Quantum computing uses quantum bits, or qubits, which can exist in a superposition of states; for certain problems this allows a quantum computer to perform computations dramatically faster than any known classical method.

2. How are classical techniques becoming obsolete in the field of quantum computing?

Classical techniques, such as algorithms and programming languages, are designed for classical computers and do not take advantage of the unique properties of quantum computing. They are less obsolete than inapplicable: quantum hardware calls for new models of computation and new algorithms, while classical techniques remain well suited to the everyday problems classical computers already solve efficiently.

3. Can quantum computers solve problems that classical computers cannot?

Yes, quantum computers have the potential to solve certain problems that are currently intractable for classical computers. This is because quantum computers can utilize properties such as superposition and entanglement to perform operations that would take a classical computer an impractically long time.

4. Are there any real-world applications for quantum computing?

Yes, there are many potential applications for quantum computing in fields such as cryptography, drug discovery, financial modeling, and optimization problems. However, it is still a developing technology and many practical applications are still being researched and developed.

5. Do I need to be a quantum physicist to program quantum computers?

No, while a background in quantum physics can certainly be helpful, there are now programming languages and tools specifically designed for quantum computing that do not require advanced knowledge of physics. However, a strong understanding of mathematics and computer science is still necessary to effectively program quantum computers.
