The reason for lambda calculus being universal

askingask
Is the fact that the order of arguments and functions matters in lambda calculus the reason it's universal? I thought about it because expressions like true and false depend on the idea that the order in which the arguments get substituted matters. Keep in mind that I have zero background in this field, so please keep the answers informal enough that I can understand. Thank you.
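To make the point about substitution order concrete: the Church booleans the poster is referring to can be sketched with Python lambdas standing in for λ-terms. The names TRUE, FALSE, NOT, AND, OR, and to_bool are purely illustrative, not standard notation; this is just a sketch of the encoding.

```python
# Church booleans: TRUE selects its first argument, FALSE its second.
# "True" and "false" differ ONLY in which argument they pick, i.e. in order.
TRUE  = lambda a: lambda b: a
FALSE = lambda a: lambda b: b

# Logical operators built purely out of application and argument order.
NOT = lambda p: p(FALSE)(TRUE)
AND = lambda p: lambda q: p(q)(FALSE)
OR  = lambda p: lambda q: p(TRUE)(q)

def to_bool(p):
    """Decode a Church boolean into a native Python bool for display."""
    return p(True)(False)
```

For example, `to_bool(AND(TRUE)(FALSE))` evaluates to `False`: AND hands its second argument to the first, so the result is decided entirely by which position each function selects.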
 
askingask said:
Is the fact that the order of arguments and functions matters in lambda calculus the reason it's universal?

The question does not make sense: there is no "reason that lambda calculus is universal".

Having said that, how useful do you think a formalism where the ordering of symbols does not matter would be?
 
pbuk said:
Having said that, how useful do you think a formalism where the ordering of symbols does not matter would be?
I see what you mean.


pbuk said:
The question does not make sense: there is no "reason that lambda calculus is universal".
But aren't there certain rules that, if you take them out, make it lose its "universality"? I just want to understand what's special about this set of rules compared to other sets that aren't universal.
 
askingask said:
Is the fact that the order of arguments and functions matters in lambda calculus the reason it's universal? I thought about it because expressions like true and false depend on the idea that the order in which the arguments get substituted matters. Keep in mind that I have zero background in this field, so please keep the answers informal enough that I can understand. Thank you.
I think what you want to understand is the Church–Turing thesis; there may be popular accounts of it that you can read. Please remember that on PF you are required to show that you made some effort to understand the question before asking it.
 
askingask said:
But aren't there certain rules that, if you take them out, make it lose its "universality"?
Universality needs to be defined. I assume you mean Turing completeness. But ...
The untyped lambda calculus is Turing-complete, but many typed lambda calculi, including System F, are not.
... so we even need a closer look at which lambda system you mean! Your question already requires non-trivial considerations!

For example, Wikipedia says
##\lambda##-terms tend to formulate general principles of mathematics rather than denote objects of the usual mathematical universe. For example, ##\lambda x.x## formulates the mapping principle of identical mapping, but this is always related to a given set as the domain of definition. A universal identity as a function is not defined in the set-theoretic formulation of mathematics. The lambda calculus in the strict sense should therefore be seen as a redesign of mathematics in which the basic objects are understood as universal functions, in contrast to axiomatic set theory, whose basic objects are sets.

I wasn't familiar with the lambda calculus, so I looked it up on Wikipedia. What is written there is quite interesting. I suggest reading
https://en.wikipedia.org/wiki/Lambda_calculus
and
https://de.wikipedia.org/wiki/Lambda-Kalkül
The latter can be translated in Chrome by right-clicking and choosing "Translate to English".
The versions are different, so it makes sense to read both. For a rigorous, albeit more sophisticated, version see
https://ncatlab.org/nlab/show/lambda-calculus

All these explanations, however, couldn't answer the question: why do you bother? You said you wanted an answer at a layman's level, but the subject itself leads directly into the rabbit hole of formal languages and/or computer science.
Lambda calculus is a formal language for studying functions. It describes the definition of functions and bound parameters and was introduced in the 1930s by Alonzo Church and Stephen Cole Kleene. Today, it is an important construct for theoretical computer science, higher-order logic, and linguistics.

Lambda calculus is a formal language from a mathematical point of view, and a rather exotic one at that.

Lambda calculus is a model of computation from a computer science point of view. Some other concepts ("Konrad Zuse incorporated ideas from the lambda calculus into his Plankalkül from 1942 to 1946.") and programming languages (Lisp, Haskell, ML) use principles of it.

Lambda calculus can be helpful in linguistics, specifically in semantics.
Semantics is the branch of linguistics that analyzes the meaning of natural language expressions. Formal semantics initially uses simple tools from predicate logic and set theory. These are then extended with the fundamentals of lambda calculus, for example, to represent propositions as properties using lambda abstraction and to represent more complex noun phrases, adjectival phrases, and some verb phrases. The basis is, for example, a model-theoretic semantic interpretation of Richard Montague's intensional logic.

I cannot see how any of these viewpoints can be treated on a basic level. Even the history of lambda calculus itself is far from easily explainable:
Wikipedia said:
In the 1930s, he [Alonzo Church] became known to his colleagues in mathematics and logic for a universal formal model of computation, the lambda calculus, which he developed as part of his research into the foundations of mathematics and Gödel's incompleteness theorems. Data and operators are embedded in the lambda calculus using Church encoding, and natural numbers are represented by Church numerals. In 1936, Church demonstrated that for two given expressions in the lambda calculus there is no computable function that can decide whether they are equivalent, and thus that there are undecidable problems in number theory (Church's theorem); two equivalent expressions can be transformed into one another or reduced to the same normal form (Church–Rosser theorem). This inspired his student Alan Turing's reflections on the halting problem of a machine performing arithmetic operations. Church and Turing then discovered that the lambda calculus and the Turing machine are equivalent models for the decision problem; a derived notion of computability is known as the Church–Turing thesis.

In the field of philosophy, he is known for his highly argumentative defense of the Platonic position in the modern Universals Controversy.
https://de.wikipedia.org/wiki/Alonzo_Church#Werk
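To make the "Church numerals" in the quote concrete, here is a minimal sketch using Python lambdas as stand-ins for λ-terms. The names ZERO, SUCC, ADD, MUL, and to_int are illustrative only; the idea is that the numeral n is the function that applies f to x exactly n times.

```python
# Church numerals: the numeral n applies f to x exactly n times.
ZERO = lambda f: lambda x: x
SUCC = lambda n: lambda f: lambda x: f(n(f)(x))

# Addition: apply f n times, then m more times.
ADD = lambda m: lambda n: lambda f: lambda x: m(f)(n(f)(x))
# Multiplication: "apply f n times", repeated m times.
MUL = lambda m: lambda n: lambda f: m(n(f))

def to_int(n):
    """Decode a Church numeral into a native int by counting applications."""
    return n(lambda k: k + 1)(0)

ONE = SUCC(ZERO)
TWO = SUCC(ONE)
THREE = SUCC(TWO)
```

With these definitions, `to_int(ADD(TWO)(THREE))` gives 5 and `to_int(MUL(TWO)(THREE))` gives 6, showing that arithmetic can live entirely inside function application.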
 
askingask said:
But aren't there certain rules that, if you take them out, make it lose its "universality"?

There are only three "rules" to lambda calculus. If you take any one of them out then it no longer does anything at all.
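To make those three term-forming rules (variables, abstraction, application) and beta reduction concrete, here is a deliberately naive sketch in Python. The tuple encoding and function names are made up for illustration, and the substitution is not capture-avoiding, so it is only safe when all bound variable names are distinct.

```python
# Lambda terms as tuples: ('var', name), ('lam', name, body), ('app', fn, arg).
# Naive substitution: no capture avoidance, so keep bound names distinct.

def subst(term, name, value):
    """Replace free occurrences of `name` in `term` with `value`."""
    kind = term[0]
    if kind == 'var':
        return value if term[1] == name else term
    if kind == 'lam':
        if term[1] == name:          # `name` is shadowed here; stop.
            return term
        return ('lam', term[1], subst(term[2], name, value))
    return ('app', subst(term[1], name, value), subst(term[2], name, value))

def reduce_once(term):
    """One sweep of beta reduction over the term (enough for these examples)."""
    kind = term[0]
    if kind == 'app' and term[1][0] == 'lam':
        lam = term[1]
        return subst(lam[2], lam[1], term[2])    # the beta step itself
    if kind == 'app':
        return ('app', reduce_once(term[1]), reduce_once(term[2]))
    if kind == 'lam':
        return ('lam', term[1], reduce_once(term[2]))
    return term

# (\x. x) y  reduces to  y
identity_applied = ('app', ('lam', 'x', ('var', 'x')), ('var', 'y'))
```

Running `reduce_once(identity_applied)` yields `('var', 'y')`; everything the calculus "does" is repeated application of that single beta step to terms built from the three formation rules.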

askingask said:
I just want to understand what's special about this set of rules compared to other sets that aren't universal.

Again, that is not a meaningful question: as @fresh_42 says:

fresh_42 said:
You said you wanted an answer at a layman's level, but the subject itself leads directly into the rabbit hole of formal languages and/or computer science.

... in particular computability theory. The Wikipedia page on computability theory gives a reasonable introduction to this topic for the layman and provides most of the necessary context, without which any attempt to understand the significance of lambda calculus or the Church–Turing thesis will be fruitless.
 
Wow, very thorough.
fresh_42 said:
Universality needs to be defined. I assume you mean Turing completeness.
Could one not argue also that the Turing machine is "lambda-calculus complete" or something, if they are equivalent according to the Church–Turing thesis? I just feel like there are underlying patterns that both might share. For example, there is also "Rule 110", a cellular automaton known to be Turing complete. The same way Rule 110 can simulate a Turing machine, a Turing machine can simulate Rule 110, making it "Rule 110 complete", and so on; the list goes on.
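For readers unfamiliar with it, a single update step of the Rule 110 automaton mentioned above can be sketched in Python. The wrap-around boundary and the function name `step` are choices made for this sketch; the rule table itself is just the bits of the number 110.

```python
# One update step of the Rule 110 elementary cellular automaton.
# Each cell is 0 or 1; its next state is looked up from the triple
# (left neighbour, itself, right neighbour) in the bits of 110.
RULE = 110

def step(cells):
    """Apply Rule 110 once, treating the row as circular (wrap-around)."""
    n = len(cells)
    out = []
    for i in range(n):
        left, centre, right = cells[i - 1], cells[i], cells[(i + 1) % n]
        neighbourhood = (left << 2) | (centre << 1) | right   # value 0..7
        out.append((RULE >> neighbourhood) & 1)               # bit lookup
    return out
```

Starting from a single live cell and applying `step` repeatedly produces the characteristic left-growing triangles of Rule 110; the proof of its Turing completeness is, of course, far beyond this sketch.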

Church,[7] Kleene,[8] and Turing[9][11] proved that these three formally defined classes of computable functions coincide: a function is λ-computable if and only if it is Turing computable, and if and only if it is general recursive. This has led mathematicians and computer scientists to believe that the concept of computability is accurately characterized by these three equivalent processes. Other formal attempts to characterize computability have subsequently strengthened this belief
https://en.m.wikipedia.org/wiki/Church–Turing_thesis

That is why I asked the question at the start of the thread. The idea of encoding numbers and even operations in lambda calculus with these expressions and rules was really something different. But then I realized that the idea of having symbols and a set of rules that describe how they interact isn't that special, because most of mathematics deals with that, right? If I take Peano arithmetic, for example, it is Turing complete, and again the Turing machine is "Peano-arithmetic complete". With all these different models and rules, which are all roughly equally capable, my mind tells me that this shared characteristic could somehow be pinpointed to a common shared phenomenon.

I feel like everyone in the thread either didn't understand what I was trying to say or I didn't understand what they were trying to tell me. Is there a flaw in my thinking? Or have I gotten it completely wrong?
 
askingask said:
Could one not argue also that the Turing machine is "lambda-calculus complete" or something, if they are equivalent according to the Church–Turing thesis?

I just feel like there are underlying patterns that both might share.

Perhaps: both are effective methods for solving a certain class of problems (computing computable numbers).

For more about this read Turing's 1936 paper On Computable Numbers.

Edit: or even better, instead of diving into the deep end without first learning to swim, study a course in computability theory, e.g. https://www.cl.cam.ac.uk/teaching/2021/CompTheory/comt-notes.pdf.

askingask said:
Could one not argue also that the Turing machine is "lambda-calculus complete"
...
"Rule 110 complete", and so on; the list goes on.

Yes, the list goes on ad infinitum. It is very inconvenient to have an infinite list of names for the same thing so since about 1936 everybody has agreed to use the term "Turing Complete".

askingask said:
With all these different models and rules, which are all roughly equally capable, my mind tells me that this shared characteristic could somehow be pinpointed to a common shared phenomenon.

Not without tautology: the shared characteristic IS the shared phenomenon; all Turing-complete systems are Turing complete. Perhaps you can see the triviality of what you are seeking by substituting the words "capable of computing any computable number" for "Turing complete": "all systems that are capable of computing any computable number can compute any computable number".

askingask said:
I feel like everyone in the thread either didn't understand what I was trying to say or I didn't understand what they were trying to tell me.

This is evidently true. Given that you have "zero background in this field" which do you think is more likely?
 
pbuk said:
Not without tautology: the shared characteristic IS the shared phenomenon; all Turing-complete systems are Turing complete. Perhaps you can see the triviality of what you are seeking by substituting the words "capable of computing any computable number" for "Turing complete": "all systems that are capable of computing any computable number can compute any computable number".
Are you saying that the rules that govern a model are the model's capabilities?
 
  • #10
I don't know, for example, how closely the version of "untyped" lambda calculus [as described, for example, in the notes linked in post #8] corresponds to the original one. Nevertheless, years ago I remember reading an interview. I don't remember it exactly, but this is what I recall: Kleene mentioned that it was not initially clear whether it was possible to implement the predecessor function on natural numbers, and it took him a while to figure it out. After that, Church assigned more complex functions (involving some recursion) to him. And so on...
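The predecessor puzzle Kleene solved can be illustrated with the well-known pairs trick, sketched here with Python lambdas standing in for λ-terms (all names are illustrative): walk from (0, 0) through (0, 1), (1, 2), ..., so that after n steps the first component is n − 1.

```python
# Church numerals: the numeral n applies f to x exactly n times.
ZERO = lambda f: lambda x: x
SUCC = lambda n: lambda f: lambda x: f(n(f)(x))

# Church pairs: PAIR stores two values; FST/SND select them.
PAIR = lambda a: lambda b: lambda sel: sel(a)(b)
FST  = lambda p: p(lambda a: lambda b: a)
SND  = lambda p: p(lambda a: lambda b: b)

# One step (a, b) -> (b, b+1); after n steps from (0, 0) the first
# component lags one behind n, which is exactly the predecessor.
SHIFT = lambda p: PAIR(SND(p))(SUCC(SND(p)))
PRED  = lambda n: FST(n(SHIFT)(PAIR(ZERO)(ZERO)))

def to_int(n):
    """Decode a Church numeral into a native int."""
    return n(lambda k: k + 1)(0)
```

Note that this gives PRED(0) = 0, the usual convention for the truncated predecessor; the non-obviousness of this construction is precisely what the anecdote is about.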

===========

Regarding Peano arithmetic, I am not sure in what sense we can say that it computes a collection of functions. What one can say, though, is that the set representing its theorems is computably enumerable [that is, a c.e. set].
 
  • #11
I was looking at stevendaryl's informative posts related to this from a few years back. It occurs to me that there is at least one way to associate a collection of sets [each a subset of ##\mathbb{N}##] with ##PA##. One way is to look at all predicates of the form ##P(x)##, where ##x## is the only free variable in the predicate. Then, by looking at the truth value of ##P## for each natural number ##x##, we can naturally associate a subset of ##\mathbb{N}## with the predicate ##P##.

The resulting collection of sets [obtained by considering the sets corresponding to all predicates of the form ##P(x)##] is the collection of "arithmetic sets". It is much bigger than the collection of computable sets (or even of c.e. sets).


P.S.
I was also thinking about what could be the best way to associate (total) functions with sets in a reasonable way. If we consider an arbitrary set ##A \subseteq \mathbb{N}##, one way seems to be to write the set in (strictly) increasing order as ##A=\{a_0,a_1,a_2,\ldots\}##. Then we could define a function ##f:\mathbb{N} \rightarrow \mathbb{N}## as:
##f(0)=a_0##
##f(1)=a_1-a_0-1##
In general, for an arbitrary ##n \in \mathbb{N}## with ##n \geq 1##, we could set:
##f(n)=a_n-a_{n-1}-1##

For the case where ##A## has a finite number of elements, perhaps we could set ##f## to be ##0## [as a default of sorts] for all inputs greater than or equal to the number of elements of ##A##.
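Under the assumption that ##A## is given as a finite, strictly increasing list, the association described above can be sketched in Python (the function name is made up for this sketch):

```python
def set_to_function(a_sorted, n):
    """Compute f(n) for a set A listed in strictly increasing order.

    f(0) = a_0, and f(n) = a_n - a_{n-1} - 1 for n >= 1; for inputs at
    or past the end of a finite listing, f defaults to 0 as in the text.
    """
    if n >= len(a_sorted):
        return 0
    if n == 0:
        return a_sorted[0]
    return a_sorted[n] - a_sorted[n - 1] - 1
```

For example, for ##A = \{2, 5, 6, 10\}## this gives f(0) = 2, f(1) = 2, f(2) = 0, f(3) = 3, and 0 thereafter: each value records the gap to the previous element, so the set can be recovered from the function and vice versa.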

Is this way of association fully satisfactory, though? I am not sure. Under this correspondence, computable sets yield computable functions, while c.e. sets yield a bigger collection than the computable functions (but one still computable relative to the halting problem).
 