Chance of building practical quantum computers


Discussion Overview

The discussion concerns the feasibility of building practical quantum computers, focusing on the challenges that quantum process tomography (QPT) poses for hardware verification, and on the roles of quantum error correction and statistical validation methods.

Discussion Character

  • Debate/contested
  • Technical explanation
  • Conceptual clarification

Main Points Raised

  • One participant argues that because the number of experimental configurations required for quantum process tomography grows exponentially with system size, the chance of building even a moderately sized quantum computer is low, since verification is essential for design, manufacture, and maintenance.
  • Another participant points to experimental results from TU Delft as evidence of progress in quantum computation, implying that such advances may counter the concerns raised about QPT.
  • A different viewpoint emphasizes that while the state space of a quantum computer is vast, practical validation can focus on the components and statistical testing rather than exhaustive state verification, suggesting a more feasible approach to ensuring functionality.
  • A participant raises questions about the role of quantum tomography in the practical realization of quantum computing and its relationship with quantum error correction, indicating a need for clarification on these concepts.

Areas of Agreement / Disagreement

Participants express differing views on the implications of quantum tomography for the feasibility of quantum computers, with some arguing it presents significant challenges while others highlight potential advancements and alternative validation strategies. The relationship between quantum error correction and quantum tomography remains unclear and is a point of inquiry.

Contextual Notes

There are unresolved assumptions regarding the scalability of quantum process tomography and its practical applications in large systems. The discussion also reflects varying perspectives on the importance of different verification methods in the context of quantum computing.

mok-kong shen
The Wikipedia article on quantum tomography says: "The number of experimental configurations (state preparations and measurements) required for quantum process tomography grows exponentially with the number of constituent particles of a system. Consequently, in general, QPT is an impossible task for large-scale systems." Doesn't this fairly clearly indicate that the chance of building a moderately sized quantum computer is extremely low from the very beginning, since verification of computer hardware is necessary in its design, manufacture and maintenance?
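To make the exponential growth concrete, here is a minimal sketch of the standard counting for full QPT: a process on n qubits acts on a Hilbert space of dimension d = 2**n, and characterizing it takes on the order of d**2 input states times d**2 measurement settings, i.e. roughly d**4 = 16**n configurations. The exact constants depend on the tomography scheme; d**4 is only the textbook order of magnitude.

```python
def qpt_configurations(n_qubits: int) -> int:
    """Rough count of preparation/measurement settings for full QPT
    on n qubits: ~d^2 input states x d^2 measurements, d = 2^n."""
    d = 2 ** n_qubits  # Hilbert-space dimension
    return d ** 4

# The count explodes quickly: already infeasible at a few dozen qubits.
for n in (1, 2, 5, 10, 20):
    print(f"{n:2d} qubits -> {qpt_configurations(n):.3e} configurations")
```

At 20 qubits this is already about 10^24 configurations, which illustrates the "impossible for large-scale systems" claim in the quote.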
 
Check http://www.tnw.tudelft.nl/nl/over-faculteit/afdelingen/quantum-nanoscience/medewerkers/onderzoeksgroepen/quantum-transport/research/background-information/quantum-computation/ or http://www.tudelft.nl/en/current/latest-news/article/detail/einsteins-ongelijk-delfts-experiment-beeindigt-80-jaar-oude-discussie/

These guys show some nice results: http://www.nature.com/nature/journal/vaop/ncurrent/full/nature15759.html and it even made the Times: http://www.nytimes.com/2015/10/22/science/quantum-theory-experiment-said-to-prove-spooky-interactions.html
 
My computer has 8 GiB of memory. The number of states it can be in is so large that writing it down would take over twenty billion decimal digits. No one will ever have the time to validate that each of those states works. And yet it does work, mostly.
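Checking the arithmetic behind that claim: 8 GiB is 2**36 bits, so the memory has 2**(2**36) possible states, and the number of decimal digits of 2**k is floor(k * log10(2)) + 1.

```python
import math

# 8 GiB = 8 * 2^30 bytes, 8 bits per byte = 2^36 bits.
bits = 8 * 2**30 * 8

# Number of states is 2**bits; count its decimal digits without
# ever constructing the (astronomically large) number itself.
digits = math.floor(bits * math.log10(2)) + 1
print(f"{digits:,} digits")  # on the order of 2.1e10, i.e. ~20 billion
```

So "billions of digits" is the right order of magnitude, which only strengthens the point: exhaustive state validation is hopeless even classically.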

The trick is to not attack the state space as a black box. The state is made up of repeated pieces interacting in common ways. Failures tend to break huge swaths of the space, instead of just a single state. And even if a single state was failing somehow, it's probably hard for the user to hit that state.

The same thing applies to quantum computers. Validate the pieces. Do statistical tests on the whole. Rely on truly subtle problems being hard to hit in practice. Understand your error model and use it to guide testing. If users do find that an algorithm consistently triggers a problem, include that algorithm in your test suite. Be good enough instead of perfect.
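The "statistical tests on the whole" idea can be sketched in a few lines. This is purely illustrative (the names and the fixed error rate are made up, and a real test would execute randomized circuits and compare outputs): sample many random trials, count failures, and estimate the device's error rate rather than enumerating states.

```python
import random

def run_random_trial(error_rate: float = 0.001) -> bool:
    """Stand-in for one randomized test of the device; True = pass.
    A real version would run a random circuit and check its output."""
    return random.random() > error_rate

def estimated_failure_rate(n_trials: int, seed: int = 0) -> float:
    """Monte Carlo estimate of the failure rate from n_trials samples."""
    random.seed(seed)  # fixed seed so the sketch is reproducible
    failures = sum(not run_random_trial() for _ in range(n_trials))
    return failures / n_trials

rate = estimated_failure_rate(100_000)
print(f"estimated failure rate: {rate:.4f}")
```

The point of the design is that confidence in the estimate depends on the number of trials, not on the size of the state space, which is what makes "good enough instead of perfect" tractable.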

John Martinis recently wrote a paper on basically this subject, though at a smaller scale: Qubit metrology for building a fault-tolerant quantum computer.
 
If I don't err, Martinis' paper doesn't mention quantum tomography. Hence my layman's questions: (1) Could quantum tomography play at least some role (even if a comparatively smaller one than quantum error correction) in the practical realization of quantum computing, given that the hardware needs diverse sorts of verification? (2) What is the relationship between quantum error correction and quantum tomography?
 
