# Cantor's Diagonalization Proof of the Uncountability of the Real Numbers

Human beings have very poor intuitions about infinity. When dealing with infinity, one should NEVER trust intuition but rather rely on formal logical arguments.
I absolutely agree with that because infinity is ultimately nothing more than an abstract mathematical concept.

There's really no reason to believe that mathematical infinities have anything at all to do with physical reality or the real world. So why should those abstract notions be intuitive?

I personally have reasons to disagree with many foundational concepts of set theory that actually cause all of these abstract multiple sized infinities. As far as I'm concerned these are all based on flawed logic that has unfortunately already been accepted into mathematical formalism mainly due to ideas introduced by Georg Cantor himself.

I am totally in agreement with the words of Henri Poincaré. Forgive me if the quote is not precise, as I am recalling it from memory, but it goes something like this:

"Georg Cantor's transfinite set theory is a disease from which future mathematics will someday be cured" - Henri Poincaré

I'm in total agreement with Henri Poincaré, and I personally believe that at some point humans will fix this mistake.

I honestly believe that if aliens came here from another planet they would laugh at our notion of multiple sized infinities. It's simply an unnecessary and unrealistic concept, and it's actually caused by the formal introduction of an "empty set" into set theory (which I believe was also the work of Cantor). It's simply not needed.

All of these weird ideas have sprung from the formalization of set theory around the turn of the 20th century. Mathematics did fine without them for some 2,000 years prior to that, including Newton's and Leibniz's invention of calculus.

In the meantime I believe that, as Henri Poincaré's comment predicts, mathematics will indeed someday be 'cured' of Cantor's transfinite sets. They simply aren't required, and they are basically nonsense.

It's no wonder they don't make intuitive sense. They truly are nonsense.

pbuk
Gold Member
So do you believe that a number exists that is 10x the largest number that exists? Or even one more than the largest number that exists? You don't need set theory to see that this cannot be true, and neither can it be true that your rectangle has more rows than it does columns.

pbuk
Gold Member
the interested reader following this thread (if there is somebody like this).
rofl

Note that mathematicians write $\aleph_0$ instead of $\infty$ in these cases.
I'm not even a mathematician and I knew that.

$\aleph_0$ is one of Cantor's transfinite cardinal numbers, and it is basically used as a 'number' representing a completed infinite 'quantity'.

Precisely what Henri Poincaré warned Cantor not to do (i.e. not to think of infinity as a completed thing).

The symbol $\infty$, by contrast, is more representative of an infinite process; it is used a lot in calculus limits, etc., which are indeed processes.

Henri Poincaré would be far more approving of infinity thought of as an endless process.

I may not be a mathematician, but I'm not exactly ignorant of math either.

But right now I'm going to bed. (ha ha)

We have all heard such quotes before. In earlier eras they might have said:

"Negative numbers are a disease from which future mathematics will someday be cured"

or

"Imaginary numbers are a disease from which future mathematics will someday be cured"

or

"Non-Euclidean geometry is a disease from which future mathematics will someday be cured"

or

"Decimal notation is a disease from which future mathematics will someday be cured"

These statements are all quite incorrect. Negative numbers, imaginary numbers, non-Euclidean geometry and transfinite numbers are ALL very useful concepts. They will NOT go away.

Transfinite numbers yield a perfectly consistent theory (insofar as the natural numbers are consistent), and they form a theory that is very widely applicable. They show up literally everywhere in mathematics.

Cantor's ideas were initially despised, and Cantor ended up in a deep depression. People like Kronecker and Poincaré were (although they were very smart) ultimately conservative and did not foresee the huge successes of axiomatic set theory.

I understand you don't like it. But you have to realize that it will not go away, because mathematicians like it. Transfinite numbers might be unintuitive, but they are ultimately entirely logically sound.

I notice a trend. Initially an innovative new idea is always laughed at, then it is furiously attacked, and in the end it is accepted as common sense. (Somebody else said this, but I can't remember who.) Right now, transfinite numbers ARE common sense. Mathematics will not be cured of axiomatic set theory; removing axiomatic set theory from mathematics would rip out its very heart.

pwsnafu
Henri Poincaré would be far more approving of infinity thought of as an endless process.
In that case Henri Poincaré should not work on Wall Street.

I am a mathematician (of sorts), but I am trying to present an argument to someone who isn't in terms that he seems to understand.
I have this argument over aleph-naught occasionally with my brother, who is a mathematician, and he and I both have a problem with the "Axiom of Choice" as he calls it.
Let me ask a simple question: if I were to show you a way to count every possible decimal representation as a unique list -- and that means as many digits as can be imagined (inductively, given n digits, I will list n+1 as well...) -- what would it do to this silly argument?

I realize many people think it is impossible; but that is part of the point.
I believe it can be done -- is it sufficient to put a big !"?"! into Cantor's whole idea?

(I am serious... and this isn't a difficult proof.)

I have this argument over aleph-naught occasionally with my brother, who is a mathematician, and he and I both have a problem with the "Axiom of Choice" as he calls it.
Let me ask a simple question: if I were to show you a way to count every possible decimal representation as a unique list -- and that means as many digits as can be imagined (inductively, given n digits, I will list n+1 as well...) -- what would it do to this silly argument?

I realize many people think it is impossible; but that is part of the point.
I believe it can be done -- is it sufficient to put a big !"?"! into Cantor's whole idea?

(I am serious... and this isn't a difficult proof.)
I really don't understand what you're trying to say. The axiom of choice has nothing, absolutely nothing to do with Cantor's diagonalization.

Deveno
Leucippus, let me offer you a variant of Cantor's proof, one that shows that your consideration of "slope" is irrelevant:

we proceed as before, except, for our "diagonal number" we do THIS:

if the first digit of the first number is 1, we assign the diagonal number the first digit 2.

otherwise, we assign the first digit of the diagonal number to be 1.

the next 8 digits of the diagonal number shall be 1, regardless.

if the 10th digit of the second number is 1, we assign the diagonal number the 10th digit 2.

otherwise, we assign the number 1 to the 10th digit of the diagonal number.

the next 89 digits of the diagonal number are assigned the digit 1.

if the 100th digit of the third number is 1, we assign the digit 2 to the 100th digit of our diagonal number.

otherwise, we assign 1 to the 100th digit of our diagonal number.

we continue in this way.

now the "slope" is "even worse": the diagonal is not even as steep as before. nevertheless, no matter which n we pick, the diagonal number differs from the first n numbers in our list, so it differs from every number on our list (it differs from the k-th number in the $10^{k-1}$-th digit).
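The construction above can be sketched directly in Python. This is my own illustrative framing, not part of the post: the function name and the representation of each listed real as a string of fractional digits are assumptions.

```python
# A sketch of the "sparse diagonal" above: the k-th listed number is checked
# at digit position 10**(k-1); every unchecked position gets the filler digit 1.

def sparse_diagonal_digits(rows, n_digits):
    """rows[i] is the fractional digit-string of the (i+1)-th listed real.
    Returns the first n_digits of the sparse diagonal number."""
    out = []
    for pos in range(1, n_digits + 1):
        # find the smallest power of ten p = 10**(k-1) with p >= pos
        k, p = 1, 1
        while p < pos:
            p *= 10
            k += 1
        if p == pos and k <= len(rows):
            # checkpoint: differ from digit pos of the k-th listed number
            row_digit = rows[k - 1][pos - 1] if pos <= len(rows[k - 1]) else "0"
            out.append("2" if row_digit == "1" else "1")
        else:
            out.append("1")  # filler digit, as in the construction above
    return "".join(out)

# the k-th row disagrees with the diagonal at position 10**(k-1):
d = sparse_diagonal_digits(["1111111111", "9999999999"], 10)
assert d[0] != "1111111111"[0] and d[9] != "9999999999"[9]
```

Even with most positions "wasted" on filler, the result still differs from every listed number, which is the point: the slope of the diagonal is irrelevant.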

@ micromass: i disagree with the contention that if a proof can only be demonstrated graphically, it isn't valid. for what is a formal system, if not a few squiggles on paper? is that not graphical? i don't know about you, but i read most of my mathematics.

more to the point: many proofs in algebraic topology or category theory amount to: diagram-chasing. the diagram IS the proof, in some cases. there's no logical reason why "pictures" can't be made every bit as unambiguous and consistent as well-formed formulas. constructing a length with a compass and straight-edge IS a proof of constructibility (a concept which can be expressed purely algebraically).

there are many people who think that a mathematical result "isn't true" until it is proven formalizable in (first- or second-) order logic in a formal language that has encoded the axioms of Zermelo-Fraenkel set theory. this is unfortunate.

pbuk
Gold Member
If I were to show you a way to count every possibly decimal representation as a unique list -- and that means as many digits as can be imagined (inductively given n digits, I will list n+1 as well...) What would it do to this silly argument?
Any countable list can be used as the start of Cantor's argument: simply construct a number whose nth digit is different from the nth number on the list for every n and you have a number that is not on the list.
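The diagonal step just described can be sketched in a few lines of Python. The function name and the sample digit-strings are my own illustrative assumptions:

```python
# Construct a number whose nth digit differs from the nth listed number's
# nth digit; the resulting digit-string names a real absent from the list.

def diagonal_missing(listing, n):
    """listing[i] is the fractional digit-string of the (i+1)-th listed real.
    Returns n digits, differing from listing[i] at position i for each i < n."""
    return "".join("5" if listing[i][i] == "4" else "4" for i in range(n))

sample = ["3141592653", "7182818284", "4444444444", "0000000000",
          "9999999999", "5555555555", "1234567890", "0987654321",
          "1111111111", "2222222222"]

d = diagonal_missing(sample, len(sample))
assert all(d[i] != sample[i][i] for i in range(len(sample)))
```

Using only the digits 4 and 5 in the construction also sidesteps the 0.999... = 1.000... ambiguity that comes up later in the thread.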

This is kind of off topic but I'm curious why graphs can not be proofs (I have not taken a course on logic)?

more to the point: many proofs in algebraic topology or category theory amount to: diagram-chasing. the diagram IS the proof, in some cases.
A commutative diagram is not a proof. And many mistakes in category theory actually amount to misleading diagrams. A commutative diagram is a suggestion of a proof. It is merely an illustration.

If you look at Bourbaki, then they show that they treat commutative diagrams very rigorously. They try to prove everything with algebraic means.

To me, a proof is correct if it is formalizable in some kind of logical system. You don't actually NEED to formalize it; just giving a suggestion that it can be done is good enough.

Hurkyl
Staff Emeritus
Gold Member
Precisely what Henri Poincare warned Cantor not to do (i.e. not to think of infinity as a completed thing)
Ah joy, "completed infinity".

Just for your information, I have never encountered somebody who uses that term who could give an even quasi-rigorous account of what it means.

From observing people who use it, I quite honestly believe that it literally means "I am unwilling or unable to engage in one more layer of abstraction, and thus have failed to recognize that we're talking about the same idea." (I don't mean any insult by this, except possibly in the 'unwilling' case.)

This is kind of off topic but I'm curious why graphs can not be proofs (I have not taken a course on logic)?
Hmmm, I don't think I'm expressing myself clearly enough.

There are actually two kinds of proofs: informal and formal proofs.

By the very definition, a proof is formal. A proof is defined as a sequence of logical algebraic steps, each of which follows from earlier steps by applying an inference rule.

These things are the actual proofs. However, they are EXTREMELY hard to read. If all math books consisted of formal proofs, then math books would be unreadable.

That's why we allow informal proofs. An informal proof presents people the basic ideas and reasoning of a proof. An informal proof suggests that the formal proof can be carried out if one would want to do so.

All text books contain informal proofs (and it's good that they do). Text books like to work with a lot of text and a lot of pictures to make it easier on the student. But one should always keep in mind that one should be able to formalize the proof.

And indeed, one can formalize all the proofs. An example is given by http://us.metamath.org/index.html which contains thousands of proofs including the celebrated Hahn-Banach theorem. This site shows what actual formal proofs look like. And as one can see, it is completely unreadable. (the site also shows formal proofs in plane geometry, including constructibility).

I am NOT advocating that everybody should be using formal proofs in journals or education. Quite the contrary. But students should be well aware that the formal proof can be carried out if one wants to do it.

Hmmm, I don't think I'm expressing myself clearly enough.

There are actually two kinds of proofs: informal and formal proofs.

By the very definition, a proof is formal. A proof is defined as a sequence of logical algebraic steps, each of which follows from earlier steps by applying an inference rule.

These things are the actual proofs. However, they are EXTREMELY hard to read. If all math books consisted of formal proofs, then math books would be unreadable.

That's why we allow informal proofs. An informal proof presents people the basic ideas and reasoning of a proof. An informal proof suggests that the formal proof can be carried out if one would want to do so.

All text books contain informal proofs (and it's good that they do). Text books like to work with a lot of text and a lot of pictures to make it easier on the student. But one should always keep in mind that one should be able to formalize the proof.

And indeed, one can formalize all the proofs. An example is given by http://us.metamath.org/index.html which contains thousands of proofs including the celebrated Hahn-Banach theorem. This site shows what actual formal proofs look like. And as one can see, it is completely unreadable. (the site also shows formal proofs in plane geometry, including constructibility).

I am NOT advocating that everybody should be using formal proofs in journals or education. Quite the contrary. But students should be well aware that the formal proof can be carried out if one wants to do it.
I better see what you're getting at now. Thank you for clarifying.

I better see what you're getting at now. Thank you for clarifying.
Just to clarify it some more: if I see a graph somewhere which intuitively shows me something, then I will accept this as proof. Any mathematical journal (except logic-journals) will accept it as a valid proof. If you write it on the exam, then the professor will accept it as proof (except in a logic class). But it's not a formal proof in the strict sense of the word.

Hurkyl
Staff Emeritus
Gold Member
mm: I think the point Deveno makes (and maybe that Jorriss is wondering about) is that proof theory can be cast in settings other than string processing.

For example, arithmetic expressions and equations in category theory are often presented in graphs.

In Categories, Allegories, Freyd introduces the idea of a Q-sequence (and a Q-tree) that consists of a path through a category and a sequence of quantifiers, which can replace the role of formulas.

Applied to the category of finitely-presented categories, he uses them to prove a metatheorem that an equivalence functor preserves and reflects a property if and only if it can be expressed on a blackboard.

I really don't understand what you're trying to say. The axiom of choice has nothing, absolutely nothing to do with Cantor's diagonalization.
In my discussions of aleph-naught with my brother, the Axiom of Choice does come up.
But, set that aside -- what I am saying is that I believe I can show an algorithm which will demonstrably count every possible decimal number in the decimal notation system, uniquely, and once.

eg: 1.01 will be counted exactly once, as will 2.01, 2.011, 2.0000001, etc.
I am not trained in formal mathematical proofs with respect to aleph-naught, etc., so excuse my vagueness.
However, my understanding is that the Real numbers are representable by the decimal notation system; and that the "Ideal" representation of all real numbers is spanned by a mantissa and fraction portion that becomes more and more perfect as the number of digits in each is allowed to grow arbitrarily large.

If I were to demonstrate an algorithm which showed that all possible decimal numbers can be counted with a distinct index; say 'i' so that any imaginable real number would exist "somewhere" in a sequence defined by an index (i) which is an integer from 0 to as large as one likes. eg: every possible decimal number could be found by a static algorithm called f(i).

The only issue which would come up is the representation of 0.99999999.... vs. 1.00000.... and whether or not they are distinct numbers as i becomes arbitrarily large.

If I am following the argument correctly, Cantor appears to be claiming that such a unique counting is impossible. So if I gave a simple counter-example in decimal notation -- would such an algorithm be of use in this discussion?

Hurkyl
Staff Emeritus
Gold Member
eg: 1.01 will be counted exactly once, as will 2.01, 2.011, 2.0000001, etc.
If I'm understanding what you describe, then you truly do generate a list containing every finite decimal, no matter how long. The set of terminating decimals is, indeed, countable.

The problem is that most real numbers don't have finite decimal representations. It sounds like you have a specific algorithm in mind; can you write down the first 10 or so numbers? After that, can you tell me at what place the number 1/9 appears?
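Hurkyl's point (the terminating decimals are countable, yet 1/9 is not among them) can be illustrated with a sketch. The enumeration order here is my own choice, not anything andrewr has proposed:

```python
# List every terminating decimal in (0, 1) exactly once: for each digit
# count n, emit k / 10**n for 1 <= k < 10**n with the last digit nonzero.

from fractions import Fraction
from itertools import count, islice

def terminating_decimals():
    for n in count(1):              # n = number of fractional digits
        for k in range(1, 10 ** n):
            if k % 10 != 0:         # canonical form: last digit nonzero
                yield Fraction(k, 10 ** n)

first = list(islice(terminating_decimals(), 12))
# 0.1, 0.2, ..., 0.9, then 0.01, 0.02, 0.03, ...
assert Fraction(1, 10) in first
assert Fraction(1, 9) not in first   # 1/9 = 0.111... is never emitted
```

Every truncation 0.1, 0.11, 0.111, ... appears somewhere in this list, yet 1/9 itself never does, because 1/9 is not equal to any k / 10**n.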

If I'm understanding what you describe, then you truly do generate a list containing every finite decimal, no matter how long. The set of terminating decimals is, indeed, countable.

The problem is that most real numbers don't have finite decimal representations. It sounds like you have a specific algorithm in mind; can you write down the first 10 or so numbers? After that, can you tell me at what place the number 1/9 appears?
Yes, and Yes, although your second question can't be written down inside a text terminal of this size. Representation of an integer is not the same as existence of that same integer.
There are many counting numbers which we know exist, but can't be written down inside a text terminal of this size.

You would agree, no doubt, that "0.999999999....." is a representation of a real number; and by the ellipsis I am suggesting a pattern for something which is much larger than I would like to write down, although I DO KNOW HOW to write it down -- eg: and that the representation improves in accuracy and may have a selectable accuracy for a given discussion or purpose.

The algorithm itself can be adapted for various constraints; eg: there are many such algorithms based on a similar methodology.

Do you care if I count only the positive reals, or do I need to include the negatives as well?
etc.

Deveno
Yes, and Yes, although your second question can't be written down inside a text terminal of this size. Representation of an integer is not the same as existence of that same integer.
There are many counting numbers which we know exist, but can't be written down inside a text terminal of this size.

You would agree, no doubt, that "0.999999999....." is a representation of a real number; and by the ellipsis I am suggesting a pattern for something which is much larger than I would like to write down, although I DO KNOW HOW to write it down -- eg: and that the representation improves in accuracy and may have a selectable accuracy for a given discussion or purpose.

The algorithm itself can be adapted for various constraints; eg: there are many such algorithms based on a similar methodology.

Do you care if I count only the positive reals, or do I need to include the negatives as well?
etc.
if you give a recursive enumeration of every real between 0 and 1, that should suffice. if you indeed possess such an algorithm and it does what you claim, it would be of great interest to many people.

but...it is highly likely that there is an undiscovered flaw in your algorithm, which might not easily be detectable (especially if the algorithm is of sufficient complexity).

A commutative diagram is not a proof. And many mistakes in category theory actually amount to misleading diagrams. A commutative diagram is a suggestion of a proof. It is merely an illustration.

If you look at Bourbaki, then they show that they treat commutative diagrams very rigorously. They try to prove everything with algebraic means.

To me, a proof is correct if it is formalizable in some kind of logical system. You don't actually NEED to formalize it; just giving a suggestion that it can be done is good enough.
commutative diagrams COULD be proofs. for example p→q is just a small diagram. yes, you have a "different kind of formalizing", but logic need not be lingual. the common usage of diagrams now IS informal, this is true, but formal systems can be created which aren't string manipulation. an example can be found in:

https://www.amazon.com/dp/0963989901/?tag=pfamazon01-20&tag=pfamazon01-20

until relatively recently, geometric proofs ("pictures" a la Euclid), were the gold standard of "formal proofs". it's likely that as we accumulate more knowledge, better formal systems using more sophisticated building blocks will emerge.

(by the way, metamath is an awesome site).

Ah joy, "completed infinity".

Just for your information, I have never encountered somebody who uses that term who could give an even quasi-rigorous account of what it means.

From observing people who use it, I quite honestly believe that it literally means "I am unwilling or unable to engage in one more layer of abstraction, and thus have failed to recognize that we're talking about the same idea." (I don't mean any insult by this, except possibly in the 'unwilling' case.)
why, the uniquely defined impossibly inaccessible cardinal, of course :P

if you give a recursive enumeration of every real between 0 and 1, that should suffice. if you indeed possess such an algorithm and it does what you claim, it would be of great interest to many people.

but...it is highly likely that there is an undiscovered flaw in your algorithm, which might not easily be detectable (especially if the algorithm is of sufficient complexity).
It is a simple algorithm. Don't underestimate my abilities. :)

I will tailor it to the needs of the discussion. It is probably easier to do an algorithm that counts all possible reals than one limited to [0-1).

Let me ask a question to make sure we are agreeing on a general idea:
I am going to choose a non-repeating decimal to make sure you don't think I am over-simplifying.

Take the number (2)**0.5. The representation of this real number is given to you by a counting number (2) and a function ()**0.5. I would argue that we know such a number is not finite in representation because there is no FINITE sequence of digits which repeats.
eg: the first digits in a 4-mantissa and 4-fractional representation would be: 0001.4142
By induction, the next symmetrical word size up would be:
00001.41421 ... etc.

It is not a rational number, so it cannot be represented by any ratio of counting numbers.

The proof of the infinite nature of the decimal representation is *inductive* in that no matter how many digits I show you, you can claim there is a representation with n+1 digits.

It is not necessary to write all the digits down to show that it is infinite, but simply to show that it grows without bound. But, nonetheless, the only way to express this number is as a mapping through a function (sqrt) of a counting number. The sequence of digits, however, 1.41421...., is unique.

I am going to assume that a sequence that preserves previously made digits and adds new ones in a predictable (inductive) pattern is sufficient to show the signature of a unique real number; and that likewise, a counting number may be converged to by a signature (in the reverse direction). Such a pattern, which one may work out to any desired number of digits (without END, and therefore infinite), is sufficient to show the existence in the decimal system of such a number. Eg: the sequence 1.41421...., no matter what algorithm produces it, so long as it matches the digit sequence of sqrt(2) with not a single exception, must be considered to have produced the same real number.

I let you ponder that for a bit, because I want to give you the best answer I can.
(This is starting to get fun.... I wish people would help me with my simple statistics and propagation of error problems IF I complete this successfully ..... deal? )

And now the question: is the signature of the real number inductively matched to all possible digits sufficient to indicate that it has been counted? (I am not saying we actually match all infinite digits, but that we show they *WILL* match.)

Hurkyl
Staff Emeritus
Gold Member
I don't have time to write more, so I'll make a series of brief comments.

Saying "the list X contains 0.1, 0.11, 0.111, 0.1111, and every other (terminating) decimal of that form" is very different from saying "the list X contains 1/9". In fact, the two statements have absolutely nothing to do with one another.

A representation cannot become "more accurate". "0.9999..." is notation for a specific decimal: the decimal with a 9 in every position to the right of the decimal point (and that decimal, in turn, can be interpreted as notation for the real number 1).

When I say every position, I really mean every. The decimal 0.999 has a zero in its millionth's place, for example. 0.999... has a 9 in that place. Pick positive integer x you like: 0.999... has a 9 in the position x places to the right of the decimal.

0.9999 is an approximation to 0.999.... 0.999999 is another (better) approximation to 0.999.... An algorithm that sequentially considers the numbers 0.9, 0.99, 0.999, 0.9999, and so forth does not produce a "representation that changes to become more accurate". Instead, it produces an infinite sequence of different representations of approximations to 1, each one more accurate than the last.

An algorithm that behaves in the aforementioned way, incidentally, is in of itself an exact representation of 1. (where the notation is that we interpret a program that outputs a convergent sequence as notating the limit of that sequence)

Unless you're designing the algorithm specifically for that purpose (or a similar one), it is rather difficult to devise an algorithm whose first time outputting a 'simple' number (like 1/9) happens after so many iterations that you cannot write it down -- even if we insist that you have to write it in decimal form.

The number of digits in the decimal representation of $\sqrt{2}$ doesn't "grow without bound" -- it's just a number, it doesn't make sense to "grow". The number is $\aleph_0$.

Rational numbers (even integers) can be limits of sequences of terminating decimals whose number of non-zero digits grows without bound. (e.g. the sequence 0.1, 0.11, 0.111, ...) (e.g. the sequence 0.1, 0.011, 0.00111, 0.0001111, 0.000011111, ...)
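The first example can be checked numerically. This is a quick sketch of mine using exact rational arithmetic:

```python
# The partial sums 0.1, 0.11, 0.111, ... approach 1/9 but never equal it:
# after n terms the remaining gap is exactly 1 / (9 * 10**n).

from fractions import Fraction

target = Fraction(1, 9)
term = Fraction(0)
for n in range(1, 8):
    term += Fraction(1, 10 ** n)                   # 0.1, 0.11, 0.111, ...
    assert term != target                          # no term ever reaches 1/9
    assert target - term == Fraction(1, 9 * 10 ** n)
```

The limit of the sequence is 1/9, but the limit is not a member of the sequence, which is exactly the distinction between the terms of a list and a number the list only approximates.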

pbuk
Gold Member
Yes, andrewr, you can add √2 to the countably many rational numbers and you will still have countably many numbers. By extension you can add all the quadratic irrationals and you will still have countably many; in fact you can add all the roots of polynomial equations, and you will have all the algebraic numbers in your list, which is countable.
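A standard way to see why such extensions stay countable is an explicit pairing of indices. This sketch (my own, not pbuk's) uses the classic Cantor pairing function:

```python
# Merge countably many countable lists into one list: give the pair
# (list number i, position j) the single index (i+j)(i+j+1)/2 + j,
# which walks the grid of pairs along successive anti-diagonals.

def cantor_pair(i, j):
    return (i + j) * (i + j + 1) // 2 + j

# distinct pairs always get distinct indices, so nothing is double-counted
seen = {cantor_pair(i, j) for i in range(20) for j in range(20)}
assert len(seen) == 20 * 20
```

Because the pairing is a bijection between pairs of naturals and naturals, adding a countable batch of new numbers (the quadratic irrationals, the algebraic numbers, ...) never pushes the combined list beyond countability.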

Are you satisfied that you can create any number possible and put it on the list? So therefore there does not exist any number that is not on the list? Good, because that is exactly the hypothesis that starts Cantor's proof -- that all real numbers can be written down in a list such that each real number can be mapped to an integer (its place on the list). Cantor's diagonal argument constructs a number that can plainly be seen not to be on the list: if you pick any number in the list in position n, the constructed number differs from it at least in its nth decimal digit. The hypothesis was clearly wrong.

Far from contradicting Cantor's proof, you have simply demonstrated that you can produce a list that is the starting point for that proof.

Far from contradicting Cantor's proof, you have simply demonstrated that you can produce a list that is the starting point for that proof.
No. There is no "starting point" for the proof.

Cantor asks us to consider a list of ALL the real numbers. He then constructs a real number not on that list, showing that ANY such list is necessarily incomplete. Therefore there can be no list of all real numbers.

Cantor's proof has NOTHING to do with taking some countable list and trying to add numbers to it. That's a misunderstanding that's a source of much confusion.