Is there a way to translate quantum bits into a traditional numeral system?

SUMMARY

The discussion centers on the relationship between quantum bits (qubits) and traditional numeral systems, particularly whether qubits can be interpreted as elements of a ternary or higher base numeral system. Participants clarify that while qubits can exhibit superposition, they ultimately collapse to classical bits (0 or 1) upon measurement, thus limiting their utility as a base-3 digit for information storage. The conversation highlights the complexity of quantum computing, emphasizing that qubits operate on probabilities rather than fixed values, and that quantum computers are not expected to replace conventional computers in the near future, primarily serving specialized functions such as code breaking and advanced measurements.

PREREQUISITES
  • Understanding of quantum mechanics principles, particularly superposition and measurement.
  • Familiarity with classical computing concepts, including binary numeral systems.
  • Basic knowledge of quantum computing terminology, including qubits and quantum states.
  • Awareness of algorithms in quantum computing, such as Shor's algorithm.
NEXT STEPS
  • Explore the concept of quantum superposition and its implications for quantum computing.
  • Study Shor's algorithm and its applications in quantum factoring.
  • Investigate the differences between classical bits and qubits in terms of information processing.
  • Learn about the potential applications of quantum computing in cryptography and complex problem-solving.
USEFUL FOR

Researchers, quantum computing enthusiasts, and professionals in fields such as cryptography and data science who are looking to deepen their understanding of quantum bits and their implications for traditional computing systems.

Levi Porter
Are quantum bits just a form of a ternary numeral system?

If something can be 0, 1, or both simultaneously, isn't the superposition just another, equally valid value?

If the superposition of 0 and 1 were literally individual, combined, and possibly something different simultaneously, then it seems a quantum bit could function as a digit in a numeral system of base 3, 4, or 5, I would guess.

Is there a way to translate quantum bits into a traditional numeral system?
 
Individual q-bits are sphere-valued. If you choose an axis, then 0 and 1 are the north and south pole of the sphere.
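A minimal sketch of that sphere picture in Python, assuming the standard Bloch-sphere parametrization with 0 at the north pole and 1 at the south (the function name is just for illustration):

```python
import cmath
import math

def bloch_state(theta, phi):
    """Amplitudes of a qubit at polar angle theta and azimuth phi on the sphere.

    theta = 0 (north pole) gives the classical bit 0; theta = pi (south pole)
    gives the classical bit 1; everything in between is a superposition.
    """
    amp0 = math.cos(theta / 2)                        # amplitude of the 0 outcome
    amp1 = cmath.exp(1j * phi) * math.sin(theta / 2)  # amplitude of the 1 outcome
    return amp0, amp1

print(bloch_state(0, 0))            # north pole: pure 0
print(bloch_state(math.pi, 0))      # south pole: pure 1 (up to rounding)
print(bloch_state(math.pi / 2, 0))  # equator: an equal superposition
```

Every point on the sphere is a distinct state, which is why "sphere-valued" is richer than any finite set of digits.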
 
Hurkyl said:
Individual q-bits are sphere-valued. If you choose an axis, then 0 and 1 are the north and south pole of the sphere.

Thanks for the response Hurkyl.

Does sphere-valued mean that information, pulses, or values occupy the entire volume of a sphere?

If the poles are just predetermined and/or adjustable locational points of reference, including superposition, then it still seems that they could correspond to a numeral system?
 
The problem is that, unlike a classical bit, you cannot directly observe its value: measuring a qubit along some axis turns it into a classical bit. If the value was originally near the north pole, then the qubit is more likely to turn into a 1; if it was near the south pole, it is more likely to turn into a 0.

(Incidentally, all orientations I have used are an arbitrary choice)
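That measurement rule can be sketched in a few lines of Python (a toy simulation; the amplitudes and the 10,000-shot experiment are made up for illustration):

```python
import math
import random

def measure(a0, a1):
    """Collapse a qubit with amplitudes (a0, a1) to a classical bit:
    0 with probability |a0|^2, 1 with probability |a1|^2."""
    return 0 if random.random() < abs(a0) ** 2 else 1

# A qubit whose value sits near the "1" pole collapses to 1 most of the time.
a0, a1 = math.sqrt(0.1), math.sqrt(0.9)
counts = [0, 0]
for _ in range(10_000):
    counts[measure(a0, a1)] += 1
print(counts)  # roughly [1000, 9000]
```

Note that each individual measurement still yields only a 0 or a 1; the richer sphere-valued state shows up only in the statistics.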

The art of quantum computing is to get your qubits to interact without measuring them while the computer is running -- you only want to measure them at the very end, at which point you are very likely to see the answer you wanted.


Of course, when writing your algorithm, you would design it to use the full set of values. But I have no idea how it would correspond to a numeral system.

Incidentally, I don't see how classical bits correspond to a numeral system either.
 
Hurkyl said:
Incidentally, I don't see how classical bits correspond to a numeral system either.

I could be misinterpreting... anyway, my understanding is that conventional bits are generated by electric pulses that are either on or off, translated as a 1 or a 0, and used as a code assigned to all digital information. That functionality of two possibilities is referred to as the binary numeral system, or base-2 number system.

At least this is my understanding from http://en.wikipedia.org/wiki/Binary_numeral_system
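For what it's worth, the base-2 reading of a classical bit pattern is easy to make concrete (a quick Python sketch):

```python
# Each position in a bit pattern is a power of 2, so "0100" reads as a base-2 numeral.
bits = "0100"
value = int(bits, 2)  # 0*8 + 1*4 + 0*2 + 0*1 = 4
print(value)                 # 4
print(format(value, "04b"))  # back to "0100"
```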
 
Levi Porter said:
Are quantum bits just a form of a ternary numeral system?

Not really, for one thing because upon measurement a qubit will always have the value 0 or 1. For example, if I have 4 bits then I can store 2^{4} states, and pass any of them to you, and you'd be able to recognize which state I had passed you (0100, 0011, etc.). On the other hand, if I have 4 qubits, then any of the 3^{4} "ternary" states I send you will always collapse to one of the 2^{4} classical states, so although we can interpret a qubit as a base-3 digit, we can't use it as a base-3 digit for storing and passing information.

In quantum computation we also use more information for each qubit than can be stored in a base-3 digit. For example, rather than having the three states

0, 1, 01

in quantum computation you might have a state of the form

[x] 1 + [y] 0

where x^{2} gives the probability of the qubit being 1 and y^{2} gives the probability of it being 0.

Therefore a qubit can have any number of states defined by [x y], where x^{2} + y^{2} = 1.
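To see why [x y] gives far more than three states, a quick sketch (the sample angles are arbitrary): any point on the unit circle satisfies x^2 + y^2 = 1, so there is a continuum of valid states, not a third discrete digit.

```python
import math

# Any angle t gives a valid qubit state [x y] = [sin t, cos t], since x^2 + y^2 = 1.
for t in (0, math.pi / 6, math.pi / 4, math.pi / 2):
    x, y = math.sin(t), math.cos(t)
    print(f"x={x:.3f}  y={y:.3f}  P(1)={x*x:.3f}  P(0)={y*y:.3f}")
```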

In operation, a quantum machine isn't equivalent to a base-3 machine either (which is actually equivalent to a base-2 machine, as they both can be modeled by a Turing machine). When a quantum computer with n qubits performs an operation, it's performing an operation on 2^{n} states (the number of basis states that the system of n qubits is in a superposition of). An operation changes the probabilities of each possible state, the one with the highest probability being the most likely to be revealed upon measurement.

This is something that a system of conventional bits would need an exponentially long time to model, even for a modest number of qubits.
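A toy state-vector simulation makes this concrete (a sketch, not how real quantum hardware works; the choice of n = 3 and of Hadamard gates is just for illustration). Applying one gate per qubit updates all 2^{n} amplitudes at once:

```python
import math

def hadamard_all(state, n):
    """Apply a Hadamard gate to each of n qubits in turn.

    The state is a list of 2**n amplitudes, one per classical bit string;
    every gate application touches all 2**n entries.
    """
    for q in range(n):
        new = [0.0] * len(state)
        s = 1 / math.sqrt(2)
        for i, amp in enumerate(state):
            partner = i ^ (1 << q)        # index with qubit q's bit flipped
            if (i >> q) & 1 == 0:         # H|0> = (|0> + |1>) / sqrt(2)
                new[i] += s * amp
                new[partner] += s * amp
            else:                         # H|1> = (|0> - |1>) / sqrt(2)
                new[partner] += s * amp
                new[i] -= s * amp
        state = new
    return state

n = 3
state = [0.0] * (2 ** n)
state[0] = 1.0  # start in the classical state 000
state = hadamard_all(state, n)
print([round(a, 3) for a in state])  # all 8 amplitudes equal: a uniform superposition
```

A classical simulation like this stores and updates 2^{n} numbers explicitly, which is exactly the cost that blows up as n grows.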

Scott Aaronson has in his blog a pretty good explanation of Peter Shor's quantum factoring algorithm using simple terms:
http://scottaaronson.com/blog/?p=208

I particularly like one of the explanations given in one of the comments:
http://scottaaronson.com/blog/?p=208#comment-10026
 
Thanks for the reply and links, Job.

-Job- said:
... so although we can interpret a qubit as a base-3 digit, we can't use it as base-3 digit for storing and passing information.

So is it safe to say that it is unlikely in the near future that quantum computers will replace conventional computers?

It seems like their usage will be, or already is, primarily for code breaking, encryption development, or more accurate measurement of the cosmos.

I still can't grasp, in layman's or non-mathematical terms, how an algorithm can simultaneously look at a series of numbers hundreds or thousands of digits in length without some sort of lineage, chronology, or individual identifier for each digit.

Qubit computers remind me of the Z Machine, which is trying to build a sun on Earth, so to speak. It can produce about 80 times the entire world's output of electricity for billionths of a second.

It seems the challenge with both is to keep them going and harness their results.
 
