Does Infinity Times Two Equal Infinity and Other Mathematical Paradoxes?

  • #1
docnet
Gold Member
TL;DR Summary
Confusion about infinity
We have come to accept that infinity times two is infinity. In the sense of 'size' that we use for everyday numbers, the rules of arithmetic with infinities seem like nonsense. For example, consider the computable number

$$0.100100100100100....$$

In the decimal expansion, there are clearly twice as many zeros as there are ones. In fact, for any finite ##n##, if we partition the decimal expansion into strings of ##3n## numerals, then each string will contain ##100\%## more zeros than ones. However, as ##n## reaches infinity, we learned that ones and zeros both appear ##\aleph_0## times, implying equality. This, to me, seems like falling short of a precise definition of what's called 'infinity'.
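To spell out the finite counting that feeds my confusion: truncating the expansion after ##3n## digits gives

$$\#\{\text{ones}\} = n, \qquad \#\{\text{zeros}\} = 2n, \qquad \frac{\#\{\text{zeros}\}}{\#\{\text{ones}\}} = 2 \quad \text{for every finite } n,$$

yet in the limit both digits are said to occur the same number of times.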

I guess I am not satisfied with the mysterious reality that math doesn't have answers to such puzzling contradictions, where our intuitions break down. Does this mean that the language we use to express mathematics, and mathematics itself, ultimately supersedes our 'intuition' and must be accepted as such without question?

edit: changed ##50\%## to ##100\%##.
 
  • #2
A very good question. But there has been a lot of good, rigorous work done on that subject. Are you familiar with Cantor's work and the definitions of equal cardinality and the larger cardinal numbers? See this
 
  • #3
docnet said:
TL;DR Summary: Confusion about infinity

We have come to accept that Infinity times two is infinity.
Technically, infinity is not an integer or a real number. Addition and multiplication using infinity are not defined. In order to define these, you would have to extend your definition of addition and multiplication.
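For instance, in cardinal arithmetic, which is one standard way of extending these operations, the countable infinity ##\aleph_0## satisfies

$$\aleph_0 + \aleph_0 = \aleph_0, \qquad 2 \cdot \aleph_0 = \aleph_0,$$

while ##\aleph_0 - \aleph_0## is simply left undefined, so the familiar cancellation rules do not carry over.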
docnet said:
In the sense of 'size' we use to think about everyday numbers, the rules of arithmetic with infinities seem like nonsense.
Not nonsense, but everyday arithmetic does not apply to infinities.
docnet said:
For example, consider the computable number

$$0.100100100100100....$$

In the decimal expansion, there are clearly twice as many zeros as there are ones.
That's not clear at all. What's clear is that in any finite truncation made up of whole "100" blocks there are twice as many 0's as 1's. Saying there are twice as many in the infinite expansion is meaningless unless you first define what "twice as many" means.
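Two standard ways of making "how many" precise give different verdicts here. Writing ##d_k## for the ##k##-th digit after the decimal point, the set of positions where ##d_k = 1## and the set where ##d_k = 0## are both countably infinite, so they have the same cardinality, while their natural densities differ:

$$\lim_{N\to\infty}\frac{\#\{k\le N : d_k = 1\}}{N} = \frac{1}{3}, \qquad \lim_{N\to\infty}\frac{\#\{k\le N : d_k = 0\}}{N} = \frac{2}{3}.$$

In the density sense there really are "twice as many" zeros; in the cardinality sense there are not.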
docnet said:
This, to me, seems like falling short of a precise definition of what's called 'infinity'.
It's your woolly, intuitive thinking that falls short of precision, and that sort of thinking isn't modern mathematics or anything like it.
docnet said:
I guess I am not satisfied with the mysterious reality that math doesn't have answers to such puzzling contradictions,
It's you who does not have answers. The deficiency is not in mathematics itself.
docnet said:
where our intuitions break down. Does this mean that the language we use to express mathematics, and mathematics itself, ultimately supersedes our 'intuition' and must be accepted as such without question?
That you do not understand mathematics does not mean that no one understands it. I understand it! I'm not just blindly regurgitating something I've been taught.
 
  • #4
First of all, thank you for your reply!! I have nothing but respect for your contributions in this forum. I really needed to read those words as I'm going back to school after a long time away. I've been spending the last few hours reading wikipedia, PF, and math overflow about related topics and riding a rollercoaster of worry, delight, fascination, clarity, and confusion. I wish I had the time and brain to learn and understand every word and math topic I'm coming across, though it is unlikely given how specialized and numerous they are. It makes me glad that there are places online to learn from others' questions!
 
  • #5
One major problem with your approach is that the number of 0's and 1's would depend entirely on the order in which they are listed. That doesn't really make sense.

0.1001001001... You say it has more 0's than 1's.
Reorder to get: 0.101010101.... You would say that there are an equal number of 0's and 1's.
Reorder some more to get 0.110110110110... You would say that there are more 1's than 0's.

That doesn't make sense.
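A quick numerical illustration of that point (a small Python sketch, not from the original post): the running fraction of 1's settles to a different value for each ordering, even though all three orderings use "the same" digits.

Code:
# Running fraction of 1's in three different orderings of "the same" digits.
patterns = {
    "0.100100100...": "100",
    "0.101010...": "10",
    "0.110110110...": "110",
}

for name, block in patterns.items():
    digits = (block * 120)[:120]  # first 120 digits of the repeating expansion
    ones = digits.count("1")
    print(f"{name}  fraction of 1's in the first 120 digits: {ones / 120:.3f}")

The three printed fractions are roughly 0.333, 0.500, and 0.667, which is exactly the ambiguity being pointed out.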
 
  • #6
It would be interesting to see if we could obtain a contradiction from assuming the two cardinalities here, of 1s and 0s, are different.
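For what it's worth, they cannot be different: list the positions of the 1s (##1, 4, 7, 10, \dots##) and the positions of the 0s (##2, 3, 5, 6, 8, \dots##) in increasing order and pair them off term by term,

$$1 \leftrightarrow 2, \quad 4 \leftrightarrow 3, \quad 7 \leftrightarrow 5, \quad 10 \leftrightarrow 6, \quad 13 \leftrightarrow 8, \;\dots$$

This pairing is a bijection, so both sets of positions have cardinality ##\aleph_0##, and assuming the cardinalities are different contradicts its existence.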
 
  • #7
There are more measures of set "size" in mathematics than just cardinality. None of them entirely matches what our naive expectations might prefer.

There is "cardinality", of course. It is quite general. One can compare arbitrary sets to decide which is of greater or equal cardinality. It is rather course-grained and fails to distinguish between sets that seem like they should be larger than other sets.

There is "asymptotic density" or "natural density". This allows one to define a numeric figure in the range from 0 to 1 for the size of some subsets of the natural numbers. Not all subsets of the natural numbers have an asymptotic density. Consider, for instance, the decimal expansion:$$1.00111100000000111111111111111100000000000000000000000000000000...$$where the number of consecutive ones and zeroes keeps doubling.

There are various measures such as the Lebesgue measure that allow one to define a numeric figure in the range from 0 to 1 for the size of some subsets of the real numbers or of an n-dimensional space. Not all subsets of such a space are measurable. This leaves room for the Banach-Tarski paradox.
 
  • #8
jbriggs444 said:
There are more measures of set "size" in mathematics than just cardinality. None of them entirely matches what our naive expectations might prefer.

There are various measures such as the Lebesgue measure that allow one to define a numeric figure in the range from 0 to 1 for the size of some subsets of the real numbers or of an n-dimensional space.
Talking about measures, the Lebesgue measure itself distinguishes sets of the same cardinality. For one, the Cantor set is uncountably infinite and of measure 0, while the real line itself is uncountably infinite, albeit of infinite measure.
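For the record, the measure-zero part is a short computation: at stage ##n## the middle-thirds construction removes ##2^{n-1}## open intervals of length ##3^{-n}## from ##[0,1]##, so the total length removed is

$$\sum_{n=1}^{\infty} \frac{2^{n-1}}{3^{n}} = \frac{1}{3}\sum_{n=0}^{\infty}\left(\frac{2}{3}\right)^{n} = \frac{1}{3}\cdot\frac{1}{1 - 2/3} = 1,$$

leaving a set of Lebesgue measure ##1 - 1 = 0## that is nevertheless uncountable.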
 
