Popular accounts of quantum computing

In summary, the conversation concerns the inaccuracy, not mere oversimplification, of mainstream media articles about quantum computing. It also touches on the difficulty of explaining quantum computing to lay audiences, on whether the blame falls on physicists for not accurately communicating these concepts, and on ways to bridge the barrier of understanding between experts and non-experts.
  • #1
VantagePoint72
I have never—and I really do mean never—seen an article in the mainstream press about quantum computing that didn't get it completely wrong. I don't just mean "oversimplified"—physics is highly technical, and glossing over important details for lay audiences is just a fact of life. I mean that they always say the same thing, and it's always simply, non-negotiably wrong. Take the following article published today: http://www.thestar.com/news/insight...y-grail-for-university-of-waterloo-scientists

Now, at one point there's an unnecessary, though largely inconsequential, conflation of the uncertainty and superposition principles—but that's not the issue. The offending piece is this:
It turns out that classical computers are not very good at factoring large numbers, a weakness that has long been exploited by cryptographers to safeguard data on the Internet. It is easy to multiply two prime numbers in order to produce a much larger number, but it turns out to be horrendously difficult to engineer the same process in reverse, to find the two prime divisors of a large number, a process called factoring.

The only way classical computers can address the challenge is by systematic trial and error — trying out two numbers to see if they work, discarding them, trying out two different numbers, and so on. There’s no shortcut.
...
By contrast, a quantum computer could crack such privacy barriers in an instant, by the dazzling expedient of testing every possible combination of divisors, not one by one, but all at once, something no conventional computer could do. The right answer would reveal itself almost immediately.
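
As an aside, it helps to make the classical side of that claim concrete. Here is a minimal Python sketch of the naive approach the article describes (my own illustration, not from the article or the researchers):

```python
# Purely illustrative: the "systematic trial and error" of the naive
# classical approach is trial division.
def smallest_factor(n):
    """Return the smallest nontrivial factor of n, or None if n is prime."""
    d = 2
    while d * d <= n:
        if n % d == 0:
            return d
        d += 1
    return None

print(smallest_factor(15))          # 3
print(smallest_factor(101 * 113))   # 101
```

This is fast for small numbers and hopeless for the hundreds-of-digits moduli used in RSA. Even so, "there's no shortcut" overstates the case: classical algorithms such as the general number field sieve beat trial division by an enormous margin, though they still take superpolynomial time.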

The article's description is not an oversimplification; it's quite simply wrong. Shor's algorithm, like every other quantum algorithm, does not run in a single step. Yes, technically every possible input is operated on simultaneously—but at the cost of getting every possible output simultaneously! Extracting useful results from a quantum computer is considerably less straightforward and relies on the fact that it is the quantum amplitude that obeys superposition, while its norm-squared is what gives the probabilities. Maybe this is too difficult to explain non-technically, but that's no excuse for saying something objectively false in its place. And yet, every article I've ever seen about quantum computing contains this same misinformation. The result is that, to the extent quantum computers are part of the public imagination, their potential computing power is grossly overstated in people's minds.
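
To make the point concrete, here is a minimal NumPy toy (my own sketch, emphatically not Shor's algorithm): a uniform superposition "contains" every basis state, but reading it out yields a single random outcome, not all the answers at once.

```python
import numpy as np

# Toy illustration: put a 3-qubit register into a uniform superposition
# over 8 basis states. All 8 values are present in the state, but each
# measurement returns exactly ONE outcome, with probability given by
# |amplitude|**2 (the Born rule).
rng = np.random.default_rng(0)

amplitudes = np.ones(8) / np.sqrt(8)   # uniform superposition
probs = np.abs(amplitudes) ** 2        # probabilities, summing to 1

# Ten independent runs: each "measurement" collapses to a single outcome.
print(rng.choice(8, size=10, p=probs))
```

The art of an algorithm like Shor's is to arrange interference so that amplitude concentrates on outcomes that encode the answer—which is why it takes many gates and, typically, several repetitions, rather than a single magical step.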

I'm curious about the origin of this problem. Notice that in this article, the offending piece doesn't actually quote the researcher, Ray Laflamme. Still, I wonder if the journalist who wrote this got his awful description directly from one of the other researchers he spoke to. I wouldn't be too surprised, since I have, on some occasions, heard physicists—ones who should know better—saying things similar to what this article says. So, for those of you who work in quantum information: where do you think the blame falls? Do your colleagues generally give reasonably accurate explanations of quantum computing to lay audiences, with the message getting lost in translation? Or do they give in to the temptation to give lazy, wrong answers because it's a difficult subject to explain? If it's the former, are there steps the scientific community can take to remedy the problem? If the latter, is there any way to get the message to other physicists that this sort of thing is not OK?

Right now, there exists a serious barrier of understanding between those who study quantum information and everyone else. I think we need to try to understand the origin of this barrier; hopefully, doing so will help us take it down. Appropriately enough, this article gets one thing right when it says that "no matter how difficult they might be to fabricate, quantum computers are even more difficult to explain." But that's no excuse for not trying.
 
  • #2
Describing physics that requires decades of learning and thought to someone who has none of it is always going to be difficult (read: impossible), no matter how you explain it.

When I explain it (I work in quantum information too), I look at my target audience and try to maximize their understanding at their level. Sometimes that requires phrases such as "Keep in mind that the following is not 100% accurate, but to make it understandable, consider the following scenario...". Of course, if they later retell what I said, they are going to omit the "not 100% accurate" part for sure, because they don't know anything about the subject, and thus inaccuracies are born. I think this is inherent in the fact that real understanding is impossible in so little time, and sometimes an inaccurate description from which they can at least take something away serves the purpose better than an accurate one that is not understood at all.
 

1. What is quantum computing?

Quantum computing is a type of computing that uses the principles of quantum mechanics to process information. It differs from classical computing in that it uses quantum bits, or qubits, which can exist in superpositions of states rather than in one definite state. For certain problems, this enables algorithms far more efficient than any known classical counterpart.
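
As a rough sketch of what a superposition of states means (illustrative only, not a full treatment):

```python
import numpy as np

# A single qubit's state is a pair of complex amplitudes (a, b) with
# |a|**2 + |b|**2 = 1. Measurement yields 0 with probability |a|**2
# and 1 with probability |b|**2.
qubit = np.array([1, 1]) / np.sqrt(2)   # equal superposition of 0 and 1
print(np.abs(qubit) ** 2)               # [0.5 0.5]
```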

2. How does quantum computing work?

Quantum computing works by manipulating the quantum states of qubits through operations that create superposition and entanglement. These operations let an algorithm set up interference between computational paths so that, for certain problems, a useful answer can be extracted in far fewer steps than a classical computer would need. However, quantum computers are still in the early stages of development and face challenges in maintaining quantum states and minimizing errors.
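
For concreteness, here is a small sketch of two such operations (an illustration with standard textbook gates, not any particular machine's API):

```python
import numpy as np

# The Hadamard gate H creates superposition; CNOT entangles two qubits.
# Applying H to the first qubit of |00> and then CNOT yields the Bell
# state (|00> + |11>)/sqrt(2).
H = np.array([[1, 1],
              [1, -1]]) / np.sqrt(2)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])

ket0 = np.array([1.0, 0.0])
state = np.kron(H @ ket0, ket0)   # first qubit in superposition, second in |0>
print(CNOT @ state)               # [0.707 0. 0. 0.707] = (|00> + |11>)/sqrt(2)
```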

3. What are the potential applications of quantum computing?

Quantum computing has the potential to impact various industries such as pharmaceuticals, finance, and cybersecurity. It could accelerate drug discovery, optimize financial portfolios, and improve encryption methods. It could also be used for complex simulations and modeling, as well as in machine learning and artificial intelligence.

4. How is quantum computing different from traditional computing?

Quantum computing differs from traditional computing in several ways. Traditional computers use classical bits, each of which is in a definite state of 0 or 1. Quantum computers, on the other hand, use qubits, whose joint state is described by amplitudes over all classical bit configurations at once. Combined with interference, this can make certain problems tractable that are believed to be intractable for classical computers.
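
One concrete, back-of-the-envelope way to see the gap (a sketch, assuming 16 bytes per complex128 amplitude):

```python
# Describing a general n-qubit state classically takes 2**n complex
# amplitudes, which is why brute-force simulation breaks down quickly.
for n in (10, 30, 50):
    amps = 2 ** n
    print(f"{n} qubits: {amps:.3e} amplitudes, ~{amps * 16 / 1e9:.3g} GB")
```

The caveat from the discussion above applies here too: a large state space does not by itself mean "parallel processing"—each run of the machine still yields only one measured outcome.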

5. When will quantum computers become mainstream?

It is difficult to predict when quantum computers will become mainstream as it depends on various factors such as technological advancements and investment. Some experts estimate that it may take another 5-10 years for quantum computers to be widely accessible and used for practical applications. However, quantum computing is a rapidly evolving field, so it is important to continue monitoring its progress.
