# What is quantum computing research mostly focused on?

1. Nov 1, 2015

### Domenico94

What is quantum computing research mostly focused on? I mean, is it mostly about the physical side (like building better quantum transistors or better quantum diodes, or, for example, exploiting entanglement effects for better purposes), or is it mostly focused on quantum architectures (a bit like von Neumann's and Alan Turing's work in the 1940s) and algorithms?

2. Nov 1, 2015

### Strilanc

There's highly theoretical research about complexity classes. Scott Aaronson blogs about that kind of thing. For example, see his post about the proof that QMA(2) ⊆ EXP or his post about a better total query complexity separation between quantum computers and BPP.

There's algorithmic research that will probably be extremely useful in practice, at least at first, e.g. to reduce the number of qubits needed for error correction or to simulate particular types of matter. And there's programming language research into what will be convenient when doing quantum programming. Needing 100 qubits instead of 200 qubits is a big deal when your goal is to double the number of qubits you can keep coherent each year, and even that is considered ambitious.

And of course there's engineering and development; trying to create physical realizations of qubits that are held stable enough, cheap enough, and controllable enough to be used as quantum computers.

I think the extremely fundamental "What are the ideas we need to make this work?" Alan Turing-ish and von Neumann-ish work is mostly done, or at least sufficiently done, though I don't really know enough to say that with confidence. Interestingly, the Claude Shannon-ish work is NOT done yet; there's still a lot to do. But mostly I think practical application is held up by the engineering challenges of invoking a good qubit into being.

3. Nov 1, 2015

4. Nov 8, 2015

### mok-kong shen

The Wikipedia article on quantum tomography claims that the work of quantum process tomography (QPT) increases exponentially with the number of qubits involved and is hence infeasible for a sizeable number of qubits. I think this clearly implies the infeasibility of practical quantum computing, since one needs to verify the correctness of the circuits of quantum computer hardware in its design, manufacture and maintenance.
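
The exponential scaling the Wikipedia article refers to can be made concrete with a quick sketch (my own illustration, not from the article): a general quantum process on n qubits acts on a d = 2^n dimensional space, and its process matrix has d^4 - d^2 free real parameters after trace preservation is imposed, i.e. roughly 16^n numbers that QPT must estimate.

```python
def qpt_parameters(n):
    # A channel on n qubits acts on a d = 2**n dimensional space and is
    # described by a d**2 x d**2 process matrix; trace preservation removes
    # d**2 constraints, leaving d**4 - d**2 free real parameters (~16**n).
    d = 2 ** n
    return d ** 4 - d ** 2

for n in (1, 2, 5, 10):
    print(n, qpt_parameters(n))  # grows by roughly a factor of 16 per qubit
```

Already at 10 qubits this is over a trillion parameters, which is the blow-up behind the infeasibility claim.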

5. Nov 8, 2015

### Strilanc

The computer you wrote that response on probably has more than 1 GiB of memory. That means your computer can be in $2^{8 \cdot 2^{30}}$ states. I'd write out that number, but it has about 2.6 billion digits. Your computer can be in more than $10^{2\,000\,000\,000}$ states! That is insane, right? No one could ever have tested that many!
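
The digit count can be checked without ever writing the number out (a quick sketch using logarithms):

```python
import math

# A machine with 1 GiB of memory holds 8 * 2**30 bits, so it has
# 2**(8 * 2**30) distinguishable states.  log10 gives the number of
# decimal digits without constructing the number itself.
bits = 8 * 2**30
digits = math.floor(bits * math.log10(2)) + 1
print(digits)  # about 2.6 billion digits
```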

In fact, because the number of states is so ridiculously high, no one has checked that even 0.000000001% of the states work! Therefore, by the same logic you're using to dismiss quantum computers, computers like the one you wrote your response on must be failing 99.999999999% of the time and you in fact did not succeed in posting. Odds are the information revolution is nothing but a shared delusion. /s

In case you're not catching my drift, I'll be explicit: it's not necessary to check every possible state or state transition to get something working in practice. Engineering mistakes have a tendency to break huge swaths of the state space, so checking the first 0.0000000000000000000000001% of states eliminates the first 99.9% of errors. Furthermore, really subtle errors that only affect say 0.0000000000000000001% of the states often don't matter in practice because users never encounter them. Really tricky errors have to be rare enough that you can't quite test enough on your own to find them, but a million users can. There's a substantial number of such errors, but not so many that computers stop working well enough to get useful things done.

6. Nov 8, 2015

### mok-kong shen

@Strilanc: Even for classical hardware, checks are constantly performed while your computer is running; you are simply not aware of them. One of them is verifying correct writes to disk storage when you save files. Because classical computer hardware has been greatly improved over the decades, you seldom see messages about irrecoverable errors. (Nevertheless, several times in the past my PC suddenly broke down and had to be rebooted. In the 1970s, computer centres with their mainframes (PCs were not yet born) had to reserve a couple of hours or more each day for maintenance, during which no normal jobs could be run.)

But quantum hardware is different from classical hardware: if during a test one checks the input qubits of a gate somewhere in a circuit, they are no longer what they were, and additionally the qubits have probabilities as their characteristics, in contrast to classical bits. That very much complicates the testing, in my view. If I don't err, one has to perform a sufficiently large number of tests of a circuit, each time with input qubits satisfying certain specifications (assuring the uniform quality of the sources of qubits is by itself not trivial, I am afraid), and test the output qubits to determine whether their probability characteristics satisfy the desired specifications. (Being a layman, I have to admit that my current knowledge does not yet cover how one concretely determines these probabilities, since they are in general complex-valued rather than real-valued quantities.)

I believe these facts underlie the exponential increase in the work of quantum process tomography with the number of qubits involved, as claimed in the Wiki article. (We need to ascertain the correct functioning of entire circuits, not simply that of the individual gates contained in them; hence the word "process" in its name, I suppose.)

Last edited: Nov 8, 2015
7. Nov 8, 2015

### Strilanc

If you think circuit verification is an exponential wall that will stop quantum computers from working, then you need to learn more about how we actually intend to make these things. The fact that experts are actually bothering to try to engineer quantum computers should be an indication that they don't think problems obvious to a layman are insurmountable. You should familiarize yourself with a problem and its proposed solutions before pronouncing it insurmountable.

If we were approaching the design problem by making random circuits and checking that they do what we want, then verification would in fact be impossibly difficult. But no one is going to use a design strategy that dumb. We're going to break the problem down into simple pieces, verify that those pieces work, prove that putting them together in a particular way should work, then put them together, check that it worked, and iterate. Just like we did with every other complicated system ever.

The basics of making large quantum circuits out of simple pieces was figured out twenty years ago. We can use T, H, P, and CNOT gates to approximate any desired operation arbitrarily well. The gate error and approximation error at each step compound additively instead of exponentially. The cost of making the errors small enough, by using more gates in the approximation and for quantum error correction schemes, is only polylogarithmic in the size of the computation.
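
The additive compounding of errors can be seen in a toy sketch (my own illustration, pure Python): compose two slightly imperfect single-qubit gates and check that the error of the composite is bounded by the sum of the individual gate errors. I use the Frobenius distance here because it is unitarily invariant, which is what makes the additive bound go through.

```python
import cmath
import math

def matmul(a, b):
    # 2x2 complex matrix product
    return [[sum(a[i][k] * b[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def dist(a, b):
    # Frobenius distance between 2x2 matrices
    return math.sqrt(sum(abs(a[i][j] - b[i][j]) ** 2
                         for i in range(2) for j in range(2)))

s = 1 / math.sqrt(2)
H = [[s, s], [s, -s]]                             # Hadamard gate
T = [[1, 0], [0, cmath.exp(1j * math.pi / 4)]]    # T gate

def rz(theta):
    # z-rotation, used here to model a small over-rotation error
    return [[cmath.exp(-1j * theta / 2), 0],
            [0, cmath.exp(1j * theta / 2)]]

eps = 1e-3
H_noisy = matmul(rz(eps), H)   # imperfect Hadamard
T_noisy = matmul(rz(eps), T)   # imperfect T

e1 = dist(H, H_noisy)
e2 = dist(T, T_noisy)
e_total = dist(matmul(H, T), matmul(H_noisy, T_noisy))
# Errors compound at most additively, not multiplicatively:
assert e_total <= e1 + e2 + 1e-12
```

This is only the two-gate case of the general bound: for unitaries, the distance of a product is at most the sum of the distances of the factors, so the total error grows linearly in the number of gates rather than blowing up.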

What's been holding us back isn't the size of the state space, it's getting the simple gates good enough.

Last edited: Nov 8, 2015
8. Nov 8, 2015

### mok-kong shen

@Strilanc: Of course I, as a layman, have to accept and follow experts' opinions. But is there any reason for me, as a layman, to doubt the words of the author of the Wiki article, who is apparently an expert, that QPT is an infeasible task for a sizable number of qubits? You may certainly question the logical validity of my deduction from that information to the infeasibility of quantum computing as such, but I don't yet clearly see your concrete points in that respect, so I cannot attempt to counter-argue. (I did, though, give my personal arguments that the correctness of any entire circuit (dedicated to a specific function) of quantum hardware should be verified as a single entity, and that verification of its components alone is unlikely to be sufficient for that purpose.)

9. Nov 8, 2015

### Strilanc

Your belief that full QPT of every circuit is a necessary step for practical quantum computing is simply wrong. QPT is neither necessary nor sufficient for quantum computing to be practical. It's a separate thing.

10. Nov 9, 2015

### mok-kong shen

@Strilanc: I would appreciate some elaboration of your claim.

11. Nov 9, 2015

### Strilanc

If you want more than the high-level details I've already given, you should buy a quantum computing textbook. It's a lot of information to explain.

12. Nov 10, 2015

### mok-kong shen

@Strilanc: Actually I recently asked a physicist working in quantum computing research how one could know that, if a hardware quantum gate (not a more complex circuit!) is specified to deliver output qubits in certain states given input qubits in certain other states, the hardware indeed satisfies the given specification. He referred me to quantum tomography. (The ten or so textbooks on quantum computing I had looked at when I posed that question to him didn't even mention quantum tomography.)

Last edited: Nov 10, 2015
13. Nov 10, 2015

### Strilanc

... How does that support your point?

I told you to read a quantum computing textbook because textbooks contain constructions for building complicated operations out of simple gates while guaranteeing bounds on error, not because they'll explain how to do testing.

I'm just trying to correct your mistaken belief that the only possible kind of testing is uninformed black-box brute-force integration testing, so useful testing is impossible, so engineering complicated systems is impossible.

14. Nov 10, 2015

### mok-kong shen

@Strilanc: Please kindly name a textbook, a page number, and cite a few lines concerning the method of testing the states of the output qubits of a quantum gate. Quantum error correction alone is not sufficient, IMHO, unless you can cite something to the contrary. (In fact, consider a design that is wrong even in theory, i.e. it actually does something different from what the designer wants, with no error in transmission etc.; how would you know that the hardware is not in accordance with its specification?)

Last edited: Nov 10, 2015
15. Nov 10, 2015

### Strilanc

We are clearly talking past each other instead of making progress, because my argument is that the difficulties you're worried about aren't difficulties. Textbooks don't address them; they go along totally different routes, so if I cited page numbers you'd complain that they were about approximating arbitrary operations with a gate set instead of about verification.

Anyways, here's a recent paper about measuring qubit error and scaling up to useful quantum computers. It has seven "technology levels", kind of like a roadmap. Process tomography is explicitly dumped at level 2:

Note that this isn't an argument against making it past level 2; it's an argument against using quantum process tomography. That's why there are 5 more levels, instead of "and therefore we give up".

16. Nov 11, 2015

### mok-kong shen

@Strilanc: In my view verification subsumes error correction. Error correction deals with unavoidable noise, while verification asks whether, at a certain test moment, a circuit or gate as such functions exactly as specified. Problems may occur due to design mistakes (e.g. some infrequently occurring situations were not properly taken care of), faults in the manufacturing process (errors by workers or machines), material aging and, last but not least, wrong handling by maintenance people. These are clearly not in the realm of error correction (cf. the error-correcting codes used in classical computing). Thus, as I wrote earlier, quantum process tomography is necessary in combination with quantum error correction measures to achieve correct quantum computing in practice. (In other words, your mentioned level 2 is the least that must be achieved for any practically useful quantum computing. If that is infeasible (according to the Wiki article), then further research efforts would be a waste of resources.)

Last edited: Nov 11, 2015
17. Nov 11, 2015

### Truecrimson

A full process tomography would tell you everything about the computation, some of which you don't need to know, because not all circuits are created equal. For example, some highly nonlocal gates probably can't even be realized in nature. But other than that, under some assumptions, there are ways to certify a quantum process without doing full tomography. http://arxiv.org/abs/1104.4695, http://arxiv.org/abs/1104.3835, and subsequent works that cite these are examples.

18. Nov 11, 2015

### mok-kong shen

@Truecrimson: At the beginning of the paper, it says that the method is only applicable to pure states. Couldn't that be a quite severe limitation in the context of practical quantum computing in general?

19. Nov 11, 2015

### Truecrimson

That would be quite a severe restriction indeed. But there are other subsequent works. Things like blind quantum computing and bootstrapping use a small or trusted quantum computer to verify that a larger or untrusted quantum computer works as advertised.

But do you agree with my first point (which is Strilanc's as well) that in reality you never have zero knowledge of what is going on in your experiment? A full tomography would be required (even in the classical case) if you had zero prior information, but that's not the case. So the fact that tomography is infeasible does not imply that computation is infeasible.

20. Nov 12, 2015

### mok-kong shen

@Truecrimson: Nothing in the real world is perfect, so a certain probability of errors and the corresponding risks of damage always have to be accepted, depending on the particular situation one is in (in contrast to other people) and one's own viewpoint ("philosophy"). In other words, one's (free) decision may well differ from that of others. In the present context of (future) practical quantum computing, I would say that I personally wouldn't use the services of a quantum computer for which it is known that, in its design, manufacture and maintenance phases, there is absolutely no (practically feasible) way to experimentally verify the correct functioning of an arbitrarily chosen component of the hardware in case of need or desire.

I doubt that I correctly understand your term "prior information" above. What is it? Is it, for example, a label put on a piece of hardware by its manufacturer saying that it serves a certain purpose (and that the manufacturer is very famous), or what?

Last edited: Nov 12, 2015