Computational learnability: jump from finite to infinite

Discussion Overview

The discussion centers on the concept of computational learnability, particularly the transition from finite to infinite processes in machine learning. Participants explore the implications of foundational theories, such as ZFC and the Continuum Hypothesis, and seek to understand the intuition behind extending results from finite sets to infinite sets.

Discussion Character

  • Exploratory
  • Technical explanation
  • Debate/contested
  • Mathematical reasoning

Main Points Raised

  • Some participants express confusion about how results regarding finite processes can be applied to infinite domains, particularly in the context of machine learning.
  • There are references to the limitations of ZFC as a foundation for machine learning, with specific mention of the Continuum Hypothesis and its implications.
  • Floating point arithmetic is discussed as a method of representing rational numbers as approximations of real numbers, with some participants noting that this representation is inherently limited.
  • Some participants argue that both computers and humans are constrained to finite descriptions of sets, raising questions about the nature of infinity in computational contexts.
  • There is a discussion about the representation of numbers in computers, with some participants suggesting that any number described in a forum post can be stored in a computer.
  • Concerns are raised about the comprehensibility of infinity when expressed through finite strings of symbols, with some participants suggesting that infinity may not be truly infinite or comprehensible.
  • Participants debate the usefulness of thinking about infinity in terms of patterns, with distinctions made between rational and irrational numbers in their decimal representations.

Areas of Agreement / Disagreement

Participants do not reach a consensus on the nature of infinity or the implications of computational learnability. Multiple competing views remain regarding the representation of numbers and the transition from finite to infinite processes.

Contextual Notes

Limitations include the dependence on definitions of infinity and finite sets, as well as unresolved mathematical steps related to the application of results from finite to infinite domains.

nomadreid
Gold Member
TL;DR Summary: Starting on basics of computational learnability, I am missing the key intuition that allows results about finite processes to reference results from the infinite domain.

Snowball:
(a) Hazard brought my attention to a popular article https://towardsdatascience.com/five-machine-learning-paradoxes-that-will-change-the-way-you-think-about-data-e100be5620d7, which had several blatant errors, so
(b) I went to the sources https://www.nature.com/articles/s42256-018-0002-3#ref-CR25 and
(c) https://arxiv.org/abs/1711.05195
which discussed ideas that I had no idea about, so
(d) I ended up clicking wildly on Wikipedia.

I did not mean to get in this deep, and I have no intention of devoting much time to it: I just want the main ideas. It is therefore possible that my question is too vague, for which I apologize in advance.

The upshot (most clearly expressed in (c)) is that ZFC is not a sufficient basis for machine learning, which is not surprising; what is more curious is the use of the Continuum Hypothesis (CH). In attempting to see how the authors extend their results to be able to invoke the independence of CH, I find that I am missing a key bit of intuition. I would be grateful for an enlightening explanation of the following:

In the above sources, I find constant references to finite sets, to their maxima, to their cardinality, and to approximations of convergence. Approximations are the order of the day, with some precise results in the form of bounds. The only references to infinity at first are in the limits, and the idea that the functions are over the real numbers (even though, as I understand it, computers only actually use rationals), or over the collection of all finite sets. Then various results are extended to aleph-null, aleph-one, and the cardinality of the reals, bringing it into the realm of the CH.

Knowing that one must be careful when extending results about the finite over to the infinite, and from approximations to exact results, I am trying to find the justification of this jump (if possible, without working through each and every equation in the papers). Could someone enlighten me (knowing that I am presently close to zero in the theory of machine learning, having only begun to read into this field)? I will take into account that any answer on this level is bound to be a simplification, but that is fine with me at this point.

[Moderator's note: Link removed which breached copyright.]
 
Thank you, pat8126. The problem remains: floating point arithmetic is an arithmetic of the rational numbers as approximations to reals, and some systems reference a few algebraic numbers as fixed constants, i.e., countable sets, and in fact further constraints keep the sets finite. The infinite, on the other hand, seems to be dealt with by humans. If we are to subscribe to the Church-Turing thesis, then either computers should not be constrained to deal only with finite sets, or humans should be.
 
nomadreid said:
then either computers should not be constrained to deal only with finite sets, or humans should be.
Humans are constrained to deal with finite descriptions of sets, just as computers are.

"##\infty##" is a finite string.
 
nomadreid said:
Thank you, pat8126. The problem remains: floating point arithmetic is an arithmetic of the rational numbers as approximations to reals, and some systems reference a few algebraic numbers as fixed constants, i.e., countable sets, and in fact further constraints keep the sets finite. The infinite, on the other hand, seems to be dealt with by humans. If we are to subscribe to the Church-Turing thesis, then either computers should not be constrained to deal only with finite sets, or humans should be.
The concept of infinity only suggests a pattern that potentially repeats forever, because no apparent reason exists for it to halt in the future.

If the universe were finite, the actual stopping point of a sequential pattern of numbers would be a rational number that no longer transforms under the rules of the previous infinite function.

If the universe actually continued forever, the best humans could do, as jbriggs444 mentioned, would be to round away the irrational values to a meaningful but inherently arbitrary level of precision.
 
nomadreid said:
The problem remains: floating point arithmetic is an arithmetic of the rational numbers as approximations to reals, and some systems reference a few algebraic numbers as fixed constants, i.e., countable sets, and in fact further constraints keep the sets finite. The infinite, on the other hand, seems to be dealt with by humans. If we are to subscribe to the Church-Turing thesis, then either computers should not be constrained to deal only with finite sets, or humans should be.
As @jbriggs444 suggested, there are many ways to represent a number in a computer. In fact, any number that you can describe in a forum post will be a number stored in computers - because those are what we are using to communicate right now.

Floating point numbers can store some rational numbers precisely and some not-so-precisely. They will never get 1/3 perfect - unless you do something extraordinary such as implementing base-3 floating point arithmetic.
So more precisely: "Floating point arithmetic is an arithmetic of a subset of the rational numbers as approximations to other rational numbers and reals."

In a Turing test, the computer would not be constrained to floating point arithmetic. In fact, it should probably avoid using such skills since they would be a dead give-away to its non-human nature.
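A quick way to see both points in Python (whose `float` is an IEEE-754 binary64 value): the standard-library `fractions.Fraction` does exact rational arithmetic, so it can expose the approximation that the float actually stores. This is only an illustrative sketch of the representation issue, not anything specific to machine-learning systems:

```python
from fractions import Fraction

# A binary64 float can only hold the nearest representable dyadic
# rational, so 1/3 (which has no finite binary expansion) is stored
# inexactly.
third = 1.0 / 3.0
print(Fraction(third))                    # the exact rational actually stored
print(Fraction(third) == Fraction(1, 3))  # False: it is an approximation

# Exact rational arithmetic avoids the rounding entirely:
print(Fraction(1, 3) * 3 == 1)            # True
```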
 
My thanks to jbriggs444, pat8126, and .Scott.
I guess the key point is that, if we think of infinity or infinitesimals as expressed by finite strings of syntactical symbols, then the semantic (model, interpretation) entity that satisfies that string is either not itself actually infinite or isn't comprehensible. Put another way: we can point to infinity, but we are either pointing at another symbol or pointing at a blank spot. Even shorter: infinity is just a blotch on paper.
 
nomadreid said:
My thanks to jbriggs444, pat8126, and .Scott.
I guess the key point is that, if we think of infinity or infinitesimals as expressed by finite strings of syntactical symbols, then the semantic (model, interpretation) entity that satisfies that string is either not itself actually infinite or isn't comprehensible. Put another way: we can point to infinity, but we are either pointing at another symbol or pointing at a blank spot. Even shorter: infinity is just a blotch on paper.
I concur.
 
  • #10
pat8126 said:
The concept of infinity only suggests a pattern that potentially repeats forever, because no apparent reason exists for it to halt in the future.
I'm not sure that thinking about this in terms of patterns is useful. Speaking of real numbers, the decimal representation of 1/3 is 0.333..., with this pattern repeating endlessly. Certainly the decimal representation has a pattern that repeats endlessly, but the representation is of a finite number. The decimal representation of ##\pi## is infinitely long, but there is no recognizable pattern, even if we go out several billion decimal digits.

Every rational number has a finite-length pattern that repeats in its decimal representation. No irrational number exhibits any such pattern.

.Scott said:
Floating point numbers can store some rational numbers precisely and some not-so-precisely. They will never get 1/3 perfect
Which we would expect, since the decimal representation of 1/3 repeats endlessly. More surprising is that fractions such as 1/5 and 1/10, whose decimal representations are 0.2 and 0.1, are also not stored precisely on computers using binary fraction representations. For example, summing ten copies of 0.1 comes out to a result a little bit less than 1 in any programming language that uses the usual floating point arithmetic, i.e., the IEEE Standard for Floating-Point Arithmetic, IEEE 754.
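The accumulation error is easy to reproduce; a minimal sketch in Python, whose `float` is an IEEE-754 binary64 value:

```python
# 1/10 has no finite binary expansion, so each stored 0.1 is slightly
# off; the rounding errors accumulate over repeated addition.
total = 0.0
for _ in range(10):
    total += 0.1
print(total)          # 0.9999999999999999
print(total == 1.0)   # False
```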
 
  • #11
Mark44 said:
Which we would expect, since the decimal representation of 1/3 repeats endlessly. More surprising is that fractions such as 1/5 and 1/10, whose decimal representations are 0.2 and 0.1, are also not stored precisely on computers using binary fraction representations. For example, summing ten copies of 0.1 comes out to a result a little bit less than 1 in any programming language that uses the usual floating point arithmetic, i.e., the IEEE Standard for Floating-Point Arithmetic, IEEE 754.
To be precise, the common floating point formats are IEEE-754 binary32 and binary64.
As their names suggest, they are binary implementations, not decimal. When using binary numbers with bits to the right of the binary point, denominators that are not powers of two can result in repeating bit patterns. So, although 1/5 and 1/10 will not result in repeating decimals, they will result in repeating binaries. More specifically: 1/101 = 0.001100110011...
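The repeating bit pattern can be produced by ordinary long division carried out in base 2. A small sketch in Python (the helper name is just for this example):

```python
def binary_expansion(num: int, den: int, digits: int) -> str:
    """Long division of num/den in base 2, emitting `digits` bits
    after the binary point."""
    bits = []
    r = num % den
    for _ in range(digits):
        r *= 2              # shift left one binary place
        bits.append(str(r // den))
        r %= den            # carry the remainder to the next bit
    return "0." + "".join(bits)

# 1/5 (decimal) repeats with period 4 in binary: 0011 0011 ...
print(binary_expansion(1, 5, 12))  # 0.001100110011
```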
 
  • #12
.Scott said:
1/101 = 0.001100110011...
To be clear, the fraction on the left is in binary. In base-10 it would be 1/5.
 
  • #13
Mark44 said:
The decimal representation of ##\pi## is infinitely long, but there is no recognizable pattern
As @Mark44 surely knows, much depends on what sorts of "patterns" one will accept. The decimal representation of pi is strongly patterned in the sense of Chaitin (algorithmic) randomness: there are finite algorithms that can generate any desired digit.

##\pi## is believed to be free of patterns in the sense that every possible finite sequence of decimal digits occurs in the decimal expansion of pi with the same frequency as every other sequence of that length.
 
  • #14
jbriggs444 said:
##\pi## is believed to be free of patterns in the sense that every possible finite sequence of decimal digits occurs in the decimal expansion of pi with the same frequency as every other sequence of that length.
That was the sense I meant in my response.
 
  • #15
Take for instance the irrational number represented by 22 divided by 7.

The "pattern" is continuing to divide by 7, to any arbitrary level of precision.
 
  • #16
pat8126 said:
Take for instance the irrational number represented by 22 divided by 7.
##\frac {22}7## is very much a rational number, being the ratio of two integers. It's not a coincidence that the two words are related. An irrational number cannot be represented as the quotient of two integers; i.e., is not a ratio.
pat8126 said:
The "pattern" is continuing to divide by 7, to any arbitrary level of precision.
The "pattern," as it is usually understood, is the repeating sequence 142857 in the decimal representation, not that we are continuing to divide by 7.
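The same long-division view makes the repetend visible. A short sketch in Python (the helper name is just for this example):

```python
def decimal_digits(num: int, den: int, count: int) -> str:
    """Digits of num/den after the decimal point, via long division."""
    digits = []
    r = num % den
    for _ in range(count):
        r *= 10             # shift left one decimal place
        digits.append(str(r // den))
        r %= den            # carry the remainder to the next digit
    return "".join(digits)

# 22/7 = 3.142857 142857 ... : the block 142857 repeats with period 6.
print("3." + decimal_digits(22, 7, 18))  # 3.142857142857142857
```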
 
  • #17
nomadreid said:
I guess the key point is that, if we think of infinity or infinitesimals as expressed by finite strings of syntactical symbols, then the semantic (model, interpretation) entity that satisfies that string is either not itself actually infinite or isn't comprehensible. Put another way: we can point to infinity but we are either pointing to either another symbol or we are pointing at a blank spot. Even shorter: infinity is just a blotch on paper.
Not sure what you're getting at here. As @jbriggs444 said earlier, '"##\infty##" is a finite string.' We can point to this symbol and get the idea that it represents a number larger than any number we could possibly think of. I don't see the gap you describe between the symbol (a blotch on paper) and what it represents.
 
  • #18
Mark44 said:
'"##\infty##" is a finite string.' We can point to this symbol and get the idea that it represents a number larger than any number we could possibly think of.
Some caution here: infinities are not ordinary numbers. Calling them "larger numbers" is a little sketchy. For example, you cannot put them into an equation as you would with common numbers.
But short of that, we can define a function ##G## by ##G(0)=1## and, for ##n>0##, ##G(n) = 10^{G(n-1)}##. Then consider ##G(G(G(100)))##.
 
