Rationale of infinitely countable sets

  • Thread starter: simeonz
  • Tags: Sets
What is the nature of countable infinity? (warning: pseudo-philosophical question follows)

I can illustrate the question better in specific context:

There is an axiom asserting the existence of a successor operation, which is capable of "counting" the set of natural numbers. Is there some phenomenon that corresponds to counting as such? (A measure for other processes of engineering significance.)
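To make the successor idea concrete, here is a minimal sketch in Python of the Peano view: a natural number is either zero or the successor of another natural number. The names `Zero`, `Succ`, and `to_int` are illustrative, not from any library.

```python
# Peano-style naturals: every number is built by finitely many
# applications of a successor operation to zero.

class Zero:
    """The base case: zero."""
    pass

class Succ:
    """The successor of some natural number `pred`."""
    def __init__(self, pred):
        self.pred = pred

def to_int(n):
    """Count how many successor applications built n."""
    count = 0
    while isinstance(n, Succ):
        n = n.pred
        count += 1
    return count

three = Succ(Succ(Succ(Zero())))
print(to_int(three))  # 3
```

"Counting" here is literally unwinding the successor applications; the axiom guarantees this process never runs out of fresh numbers.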

In the Turing machine, the computation can evolve at each step by subsuming further finite amounts from an inexhaustible pool of memory. How should I understand the "memory" metaphor here? Is it something technological, natural, or conceptual? For example, the domain of our algorithmic ideas (e.g. addition of positional numerals can be applied to arbitrarily large numbers in principle).
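One way to picture the "inexhaustible pool": a tape that is conceptually infinite but only materializes cells when they are visited. This is a hypothetical sketch, not any standard library; the dict stands in for the unbounded tape.

```python
# A lazily allocated Turing-machine tape: infinite in principle,
# but every halting run touches only finitely many cells.

class Tape:
    def __init__(self, blank="_"):
        self.cells = {}     # position -> symbol, allocated on demand
        self.blank = blank

    def read(self, pos):
        # Unvisited positions read as the blank symbol.
        return self.cells.get(pos, self.blank)

    def write(self, pos, symbol):
        self.cells[pos] = symbol

tape = Tape()
for i in range(5):
    tape.write(i, "1")      # this run needs only five cells
print(len(tape.cells))      # 5 cells actually materialized
print(tape.read(10**6))     # "_" -- still available, never allocated
```

The infinity never has to exist physically; it is a promise that no run will be cut short for lack of tape.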

In denotational semantics of recursive programs, a certain rule (operator) extends each operation from a finite domain to a slightly larger finite domain. The transition into infinity is straightforward mentally, as a generalization of the finite cases, but what are the technical provisions? I mean, computers are finite-state automata, and the construction of such an operator is only conceptual there.
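The operator in question is the one in Kleene's fixpoint construction: the meaning of a recursive definition is the limit of finite approximants, each defined on a slightly larger domain than the last. A sketch in Python, using factorial and modeling "undefined" as absence from a dict (the names `phi` and `f` are illustrative):

```python
# Kleene iteration for fact(n) = 1 if n == 0 else n * fact(n-1).
# f_0 is the nowhere-defined function (bottom); each application of
# phi extends the defined domain by one more input.

def phi(f):
    """One unfolding of the recursive definition; f is a partial
    function represented as a dict."""
    g = {0: 1}                     # the base case is always known
    for n in f:
        g[n + 1] = (n + 1) * f[n]  # recursive case, where f is defined
    return g

f = {}                  # f_0: defined nowhere
for _ in range(6):
    f = phi(f)          # f_k is factorial restricted to {0, ..., k-1}
print(f)                # {0: 1, 1: 1, 2: 2, 3: 6, 4: 24, 5: 120}
```

The infinite object (factorial on all of ℕ) is never constructed; it is the least upper bound of this chain, and any particular input is settled after finitely many iterations.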

In summary: what is the contemporary point of view on the subject?

- That our computations might actually scale indefinitely, since countable infinity is conceivable in practice

- That our computational methods and algorithms provide us only with convenient approximations of solutions to problems that may arise, even though nothing in our social, mental, or physical reality is countably infinite

(I am a programmer.)

Thanks for your attention
 
simeonz said:
What is the nature of countable infinity? (warning: pseudo-philosophical question follows)

I can illustrate the question better in specific context:

There is an axiom asserting the existence of a successor operation, which is capable of "counting" the set of natural numbers. Is there some phenomenon that corresponds to counting as such? (A measure for other processes of engineering significance.)
Well, successively adding ones corresponds to counting. The rest is too vague to answer.
In the Turing machine, the computation can evolve at each step by subsuming further finite amounts from an inexhaustible pool of memory. How should I understand the "memory" metaphor here? Is it something technological, natural, or conceptual? For example, the domain of our algorithmic ideas (e.g. addition of positional numerals can be applied to arbitrarily large numbers in principle).
It is conceptual, as there is no infinite memory tape, nor an actual TM. It is a model for describing algorithms. Since we usually deal with algorithms that come to a halt after finitely many steps, each individual case does not need the entire tape. We simply do not want to say in advance how much memory will be needed.
In denotational semantics of recursive programs, a certain rule (operator) extends each operation from a finite domain to a slightly larger finite domain. The transition into infinity is straightforward mentally, as a generalization of the finite cases, but what are the technical provisions? I mean, computers are finite-state automata, and the construction of such an operator is only conceptual there.
See above. However, if we say that the ordinary algorithm for matrix multiplication takes ##O(n^3)## steps, then we need infinity, as ##n## gets larger and larger and we do not want to use a different TM for each size. In reality you probably do not want to multiply ##10^{20} \times 10^{20}## matrices.
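The ##O(n^3)## claim can be checked directly: the textbook triple-loop multiplication of two ##n \times n## matrices performs exactly ##n^3## multiply-adds, and the asymptotic statement abstracts over all ##n## at once. A sketch in Python, instrumented with an operation counter:

```python
# Textbook matrix multiplication with an explicit count of
# multiply-add steps; for n x n inputs the count is exactly n**3.

def matmul(a, b):
    n = len(a)
    c = [[0] * n for _ in range(n)]
    ops = 0
    for i in range(n):
        for j in range(n):
            for k in range(n):
                c[i][j] += a[i][k] * b[k][j]
                ops += 1
    return c, ops

identity = [[1, 0], [0, 1]]
m = [[2, 3], [4, 5]]
product, ops = matmul(m, identity)
print(product, ops)  # [[2, 3], [4, 5]] 8   (8 = 2**3)
```

Any single run is finite; the unbounded parameter ##n## is what forces the model, rather than any physical machine, to be infinite.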
In summary: what is the contemporary point of view on the subject?

- That our computations might actually scale indefinitely, since countable infinity is conceivable in practice

- That our computational methods and algorithms provide us only with convenient approximations of solutions to problems that may arise, even though nothing in our social, mental, or physical reality is countably infinite

(I am a programmer.)

Thanks for your attention
We have to distinguish between real-life algorithms and the models that quantify an algorithm's needs in space and time. In real life, I/O is the problem anyway, and rarely space.
 