Conjecture: fundamental mathematical group

  • Thread starter: SW VandeCarr
  • Tags: Group, Mathematical
AI Thread Summary
The discussion centers on the proposition that a fundamental mathematical group exists, consisting of non-negative integers and irrational numbers, with addition and subtraction as operations. The original claim faced challenges regarding the validity of this group, particularly concerning the closure under subtraction and the exclusion of negative integers. The conversation also delves into the philosophical distinction between "discovered" mathematical objects, like natural numbers and irrationals, and "invented" ones, such as negative integers and set theory. Participants emphasize the historical evolution of mathematical concepts and the necessity of rigorous definitions in mathematics. Ultimately, the thread highlights the complexity of defining foundational mathematical structures and the philosophical implications of such distinctions.
SW VandeCarr
I'm posting here because I can't prove what I say, but I believe I can defend it on a case by case basis. I hope to attract math oriented members to challenge me.

The mathematician Leopold Kronecker said "God made the integers; everything else is the work of man." http://en.wikipedia.org/wiki/Leopold_Kronecker

Well, I'm not speculating on the source of the integers but I am proposing the following:

1. There is a group which is fundamental to all mathematics. Everything else is derivative. I'm excluding Set Theory and Logic, and considering geometry to be derivative because it can be expressed algebraically and algebra is derivative.

2. The objects of the group are the non-negative integers and the irrational numbers.

3. The operation over the group is addition with its inverse, subtraction.

4. The identity element is zero.
 
Last edited:
What do you mean by excluding set theory? Did you ever look into ZFC? You can derive your numbers from that; I'd be surprised if you could derive ZFC from your numbers. With your operations \mathbb{N} is not a group, by the way; \mathbb{Z} is. Why do you even think that \mathbb{N} is somehow fundamental when it doesn't even allow for a decent subtraction?
 
0xDEADBEEF said:
What do you mean by excluding set theory? Did you ever look into ZFC? You can derive your numbers from that; I'd be surprised if you could derive ZFC from your numbers. With your operations \mathbb{N} is not a group, by the way; \mathbb{Z} is. Why do you even think that \mathbb{N} is somehow fundamental when it doesn't even allow for a decent subtraction?

My purpose is not to reconstruct mathematics from ZFC. My purpose is to distinguish fundamental or "natural" objects from defined or "invented" objects. This is the philosophy board. I want to know the minimum number of object classes and operations that would be sufficient to define as much of mathematics as possible, as it might be defined to a computer, or to describe how mathematics might have evolved. For example: multiplication is serial addition; division is serial subtraction with remainders, or the inverse of multiplication. Algebra is a generalization from arithmetic with constants and variables. Kronecker thought it could all be done with integers only. I included the irrationals since they can only be approximated by fractions, they help with defining limits, and they are "natural", i.e. discovered. I've actually done this on a computer just to see how far I could take it. What I really want to hear is someone who can show what part of mathematics cannot be defined from only what I specified.
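To make the "serial addition/subtraction" idea concrete, here is a minimal Python sketch, assuming non-negative integers and a positive divisor. It is my own illustration, not code from the thread:

Code:
def multiply(a, b):
    """Multiplication as serial addition: add a to itself b times."""
    total = 0
    for _ in range(b):
        total += a
    return total

def divide(a, b):
    """Division as serial subtraction: count how many times b can be
    subtracted from a; whatever is left over is the remainder."""
    quotient = 0
    while a >= b:
        a -= b
        quotient += 1
    return quotient, a  # (quotient, remainder)

assert multiply(6, 7) == 42
assert divide(43, 6) == (7, 1)  # 43 = 6*7 + 1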

EDIT: You are correct in that I do not have a group. I originally had "integers" but changed it to "non-negative integers" because negative numbers aren't "natural". With all the integers, the group is closed under addition and subtraction.
 
Last edited:
Just a reminder "that our policies for discussion of science and mathematics hold just as strongly in the Philosophy Forums as anywhere else on the site. Overly speculative or incorrect statements within the domains of science and math may be moved, locked, or deleted at the mentors' discretion, and warnings may be issued. In general, there is more legroom for speculation in philosophical discussion, but it must be in the form of a well motivated question or argument, as described above. In particular, even a 'speculative' argument should be logically consistent with well established scientific knowledge and theory."
 
Evo said:
Just a reminder "that our policies for discussion of science and mathematics hold just as strongly in the Philosophy Forums as anywhere else on the site. Overly speculative or incorrect statements within the domains of science and math may be moved, locked, or deleted at the mentors' discretion, and warnings may be issued. In general, there is more legroom for speculation in philosophical discussion, but it must be in the form of a well motivated question or argument, as described above. In particular, even a 'speculative' argument should be logically consistent with well established scientific knowledge and theory."

OK. You read my response to 0xDEADBEEF. If you still think this should be removed, please remove it. I believe the development of mathematics from some basic conceptions, such as counting, the invention of fractions, and the discovery of the irrational numbers, is a legitimate subject for this forum. So would George Lakoff, I'm sure:
http://books.google.com/books?id=RxFRRyiyVt8C&dq=Lakoff+and+Nunez&printsec=frontcover&source=bl&ots=eIW9khufBc&sig=TQggAmYbx8lIt5Do3m-c6wm7GcU&hl=en&ei=RN58Suj9DZD-tQPNq7HvCg&sa=X&oi=book_result&ct=result&resnum=4#v=onepage&q=&f=false
 
Last edited:
The positive integers are not closed under subtraction.
 
humanino said:
The positive integers are not closed under subtraction.

I understand. It's also necessary to include all the integers to define a group closed under addition and subtraction. I can't modify my original post, but I accept this argument and have already conceded it to 0xDEADBEEF (post 3). Thanks
 
Well, the path you are trying to take is not very modern, and it ignores that we have found five lines that express the whole content of the natural numbers:

1. There is a natural number 0.
2. Every natural number a has a natural number successor, denoted by S(a). Intuitively, S(a) is a+1.
3. There is no natural number whose successor is 0.
4. Distinct natural numbers have distinct successors: if a ≠ b, then S(a) ≠ S(b).
5. If a property is possessed by 0 and also by the successor of every natural number which possesses it, then it is possessed by all natural numbers. (A toy encoding of these axioms is sketched below.)
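Here is a toy Python encoding of these axioms (my sketch; representing zero as an empty tuple and S as tuple-wrapping is an arbitrary choice), with addition defined by recursion on the successor structure:

Code:
ZERO = ()

def S(n):
    # Successor: wrapping in a tuple means S(n) is never ZERO (axiom 3)
    # and distinct n give distinct S(n) (axiom 4).
    return (n,)

def add(a, b):
    # Recursion on the second argument: a + 0 = a, and a + S(b) = S(a + b).
    if b == ZERO:
        return a
    return S(add(a, b[0]))

ONE = S(ZERO)
TWO = S(ONE)
THREE = S(TWO)
assert add(ONE, TWO) == THREE  # 1 + 2 = 3, using nothing but 0 and S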

Not so miraculous. The question of what the minimal set of things and relations between them is, such that you can construct all mathematics from it, is addressed more or less successfully with Turing machines.

The main part of mathematics has nothing to do with calculating values of functions, but with proofs. Most people only come in contact with calculating, and in that area pretty much everything is based on addition, multiplication and natural numbers (because we only have finite space on the paper). But from a mathematician's point of view this is a very limited part of mathematics.

So to put it in a more defining manner: you claim set theory follows from algebra, which follows from natural numbers. For this claim to hold, everything you write down to prove it must be nothing but numbers, addition and multiplication: NO TEXT! You surely cannot do this. With ZFC we can.
 
0xDEADBEEF said:
Well, the path you are trying to take is not very modern, and it ignores that we have found five lines that express the whole content of the natural numbers:

Exactly. That's the whole point. People were using natural numbers long before the present concepts of rigor were developed. This thread is about the evolution of mathematics as a process of invention and discovery. One of the distinguishing features of mathematics is that it rarely dispenses with concepts once they are established, unlike the empirical sciences. I began this thread by quoting Kronecker's assertion that only the integers were "natural" and the rest of mathematics was invented. I originally had planned to base the group on the integers and irrationals (which I think Kronecker should have included, since we in no way "invented" them). However, I changed it at the last minute because the negative integers were a more recent innovation, apparently originating in a number of places.

http://www.gobiernodecanarias.org/educacion/3/Usrn/penelope/uk_confboye.htm

They seem to have originated with businessmen looking for ways to express debt. In any case, someone suggested you could subtract seven from three and get -4. Today we regard addition with negative integers and subtraction as distinct operations, but it didn't start off that way. It soon became apparent that you could take odd roots of negative integers, but not even roots. These simply didn't exist. The invention of several Italians (Niccolo Tartaglia and others) to write i for \sqrt{-1} was suspect for centuries after. You see where I'm going. It's not so much the history of math as the evolution of mathematical thought as a model of cognitive extension. That is, how paradigms of thought may be extended or replaced.

Euclid's ideas regarding the axioms of plane geometry were long admired, but not copied until attempts were made in the 19th century. Cantor's highly original Set Theory was not in the historic line of mathematical development at all. Before this, there was only the rather vague idea of classes. It's well known that Set Theory was not accepted right away, and Cantor suffered for it. Set Theory dispenses entirely with traditional and historic mathematics and uses more basic objects and relations. It's because of this that it became a useful metalanguage for formulating definitions of mathematical objects previously assumed to be primitives. I excluded it for this reason. I think Set Theory lies somewhere between traditional number based mathematics and logic.

My basic question is what mathematical objects are discovered and what objects are invented? It's obviously a philosophical question, not a mathematical one. I expect opinions will disagree. Also, given certain discovered objects, how far can pure innovation take us?
 
Last edited:
  • #10
I'm not really sure in what sense you think your group is "fundamental".

You seem to make a point to talk about "irrational numbers" despite the fact you have no way to talk about the concept of irrationality.

e.g. your group is isomorphic to the group of all polynomials with rational coefficients

Another thing missing -- you don't seem to have any means to invoke mathematical induction


P.S. You made a (minor) omission: your group must contain the rational numbers, otherwise it's not closed under addition.
 
  • #11
Hurkyl said:
I'm not really sure in what sense you think your group is "fundamental".

You seem to make a point to talk about "irrational numbers" despite the fact you have no way to talk about the concept of irrationality.

e.g. your group is isomorphic to the group of all polynomials with rational coefficients

Another thing missing -- you don't seem to have any means to invoke mathematical induction. P.S. You made a (minor) omission: your group must contain the rational numbers, otherwise it's not closed under addition.

I wrote this to invite comment, so I suppose you could argue about what's fundamental. I was thinking that fundamental objects are discovered and derivative objects are invented. You could argue about just what invented and discovered mean, but I think Pythagoras did not want to discover that he couldn't express \sqrt{2} as a fraction.

When you talk about my group, it's only a group if I include all the integers. I had originally written it this way, but made a last-minute change (unfortunately) because I think of the negative integers as derivative (invented by commercial lenders, apparently). I think if you consider the group's objects to be the integers and irrationals, it works. The rationals were an Egyptian innovation (Egyptian fractions).

You posted before I finished the previous post. I hope it clarifies what I'm talking about. If not, please ask.

EDIT: All integers can be reached by addition and the group of integers is closed under addition, but I think I see your point about the irrationals. It looks like I'm going to have to give up any idea that I have a group, because I was reluctant to include the negative integers and now I have to give up the irrationals unless I take the rationals too. The former are clearly discovered and the latter clearly invented (to my way of thinking).

From now on, I'm just going to say that the natural numbers and the irrationals were discovered and all else was invented. The idea that I can or can't make a group out of this doesn't really affect my philosophical argument.
 
Last edited:
  • #12
SW VandeCarr said:
I think of the negative integers as derivative (invented by commercial lenders apparently).
Have you read Wikipedia's article on negative numbers?

EDIT: Aren't the integers and the irrationals closed under addition?
No: there exist two irrational numbers whose sum is 1/2.

By the way, I fibbed earlier -- I had implicitly assumed you meant only the real algebraic irrational numbers, not all irrational numbers. But the point I was making still stands: you can't do much at all if all you have is real numbers and addition. I'm 99.985% sure you can't even define the notion of "integer"!
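A concrete instance of the sum claim above (my example, not from the thread): take x = \sqrt{2} and y = \frac{1}{2} - \sqrt{2}. If y were rational, then \sqrt{2} = \frac{1}{2} - y would be rational too, so both x and y are irrational; yet x + y = \frac{1}{2}. The irrationals on their own are therefore not closed under addition.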
 
  • #13
Hurkyl said:
Have you read Wikipedia's article on negative numbers?

I did now. Apparently the Chinese were more sophisticated than the Europeans (no surprise there). But in most cases, according to the article I cited, merchants and money lenders were probably the first to use negative numbers. By the way, there's some sharp division of opinion on the use of the Wiki. It's described by some as the source that "shall not be mentioned". I think each article needs to be taken on a case-by-case basis. A well-referenced article with no editorial comments should be as good as any. This is stuff for another thread. Where does it belong?
Hurkyl said:
No: there exist two irrational numbers whose sum is 1/2.

I also understand there's some real non-zero rational power of e that's an integer (or so close it's difficult to tell it isn't an integer). All the more reason to consider the irrationals to be discovered (i.e. "fundamental") and for "discovered" to be a meaningful word.
 
Last edited:
  • #14
SW VandeCarr said:
My basic question is what mathematical objects are discovered and what objects are invented? It's obviously a philosophical question, not a mathematical one. I expect opinions will disagree. Also, given certain discovered objects, how far can pure innovation take us?

Philosophically speaking, I would suggest you are making a basic mistake. You want to define some fundamental mathematical atom - the integer - and construct your way up from this "natural object" to every "derived object" known to maths.

But with the switch from set theory to category theory, it seems that a better way of making progress is to generalise. Which means moving from the local and particular towards the global and universal by the systematic removal of constraints. So for example the shift in level from geometry to topology. Geometry works with specified distances and curvatures. Topology throws away such constraints - the measurements added to specify - and so sees reality in a more generalised fashion.

In number theory, you have a similar journey from the original number line, through imaginary numbers, and then increasingly higher dimensional algebras. Specific operations like division cease to work. A more generalised or "topological" view of arithmetic emerges.

So the philosophical question is whether you can start with your event and construct your way up to a view of the context. Can you start with an atom and construct your way up to find the void? Or can you find that larger void, that larger context, only by the successive and systematic relaxation of constraints? Relax what is forming the crisp atomness to discover the larger world of which it is a particularly pinched-up and bounded-on-all-sides part.

A focus on integers is of course a healthy thing philosophically speaking, as we would want to know the minimal set of constraints (as in the Peano axioms, for instance) that will serve to "produce" them.

So the challenge seems to be not, assume the integer and off we go... But instead what minimal collection of constraints serve to make integers possible?

In this spirit, my own take on irrational numbers has always been that the whole numberline is irrational as it is a continuous line, a geometric object. Then we add constraints to create rationals.

So the number 1 is irrational. It is 1.000... and we can keep specifying extra zeros infinitely. Indeed we would need to, to be sure that what we thought was exactly one does not go squirrelly on us and turn out to be 1.000...1 or some other location on the numberline.

For the sake of argument, we can axiomatically take 1 to be 1. But philosophically, we can see how it is also the result of a process of applying a constraint on free dimensionality.
 
Last edited:
  • #15
SW VandeCarr said:
I also understand there's some power of e that's an integer (or so close it's difficult to tell it isn't an integer). All the more reason to consider the irrationals to be discovered (i.e. "fundamental") and for "discovered" to be a meaningful word.

Yes... e^0=1 or e^{i\pi}=-1. In fact there are infinitely many of them. There is a lot of (in my opinion, useless) debate about maths being discovered or invented. I suppose by discovery you mean that it is something that is in nature that would lead to such construction. I myself have never seen an irrational number in nature, partly due to the fact that humans cannot observe infinity. Construction of an irrational number would need the axiom of infinity (which states that an infinite set exists), which I find debatable as something that is in nature.

Even worse, irrational numbers are uncountable. How on Earth could you say that this is natural?
 
  • #16
Focus said:
Yes... e^0=1 or e^{i\pi}=-1. In fact there are infinitely many of them. There is a lot of (in my opinion, useless) debate about maths being discovered or invented. I suppose by discovery you mean that it is something that is in nature that would lead to such construction. I myself have never seen an irrational number in nature, partly due to the fact that humans cannot observe infinity. Construction of an irrational number would need the axiom of infinity (which states that an infinite set exists), which I find debatable as something that is in nature.

Even worse, irrational numbers are uncountable. How on Earth could you say that this is natural?

Actually, I was thinking it was a real rational power. I've gone back and edited that.

Regarding discovering irrationals, Pythagoras was surprised and apparently not too happy in discovering them. They certainly weren't in any sense invented. Number theory is all about discovering new properties of integers, particularly primes. Research in NT has a resemblance to research in the physical sciences in that you don't know what you might find. If you don't like "natural", just say "discovered" or "not invented".

Yes, we can never write out an irrational number, but we have finite algorithms for calculating them to whatever precision we need, up to technical limits. Correct me if I'm wrong, but I understand the published algorithms for pi and e have been proven to converge.
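For what it's worth, here is the sort of provably convergent algorithm being referred to, sketched in Python (my illustration, not from the thread): partial sums of e = \sum 1/n! in exact rational arithmetic, where the tail after the first N terms is at most 2/N!, so the error at any stage is provable.

Code:
from fractions import Fraction

def e_approx(terms):
    # Partial sum of e = 1/0! + 1/1! + 1/2! + ... in exact rationals.
    # The tail after the first N terms is at most 2/N!, so the error
    # at any stage is provably small.
    total = Fraction(0)
    factorial = 1
    for n in range(terms):
        total += Fraction(1, factorial)
        factorial *= n + 1
    return total

print(float(e_approx(20)))  # 2.718281828459045, matching e to double precision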
 
Last edited:
  • #17
SW VandeCarr said:
Yes, we can never write out an irrational number,
\pi
\sqrt{2}
e
x, where x is a variable whose domain is \mathbb{R}\setminus\mathbb{Q}
 
  • #18
SW VandeCarr said:
Regarding discovering irrationals, Pythagoras was surprised and apparently not too happy in discovering them. They certainly weren't in any sense invented. Number theory is all about discovering new properties of integers, particularly primes. Research in NT has a resemblance to research in the physical sciences in that you don't know what you might find. If you don't like "natural", just say "discovered" or "not invented".

Yes, we can never write out an irrational number, but we have finite algorithms for calculating them to whatever precision we need, up to technical limits. Correct me if I'm wrong, but I understand the published algorithms for pi and e have been proven to converge.

Pythagoras was long dead when they came across irrational numbers. I am certain that we will not observe something infinite, as we can only exist and observe for a finite time; thus all we can observe is something that is very big.

My concern is not about writing out irrational numbers. Just call them x, y. It doesn't make a difference. It is the fact that there are uncountably many of them. I highly doubt that uncountably infinite is something that people will say is natural (in the sense of nature), though I admit they may be natural in mathematics. It is natural, assuming ZFC, that we construct the naturals. Then, knowing a bit about groups, it is natural to construct the integers. Knowing about fields, it becomes natural to construct the rationals, and using completion of a metric space or Dedekind cuts, it is natural to make the reals. But do you actually claim that ZFC was discovered/natural? Where is the evidence of, say, the axiom of choice or the axiom of infinity in nature?

Also if indeed pi and e did converge, they would not be irrational numbers. Who cares if we have algorithms to compute them? They are just elements of a set.
 
  • #19
Focus said:
Also if indeed pi and e did converge, they would not be irrational numbers. Who cares if we have algorithms to compute them? They are just elements of a set.

You can approximate a number like \sqrt{2} by a sequence of rationals whose squares converge to two. For pi and e, there is convergence even if we can't specify the quantity exactly in our number system. These are just the kinds of things I'm calling non-derivative. The rationals are derivative because they can be expressed as ratios of integers. Integers are non-derivative.
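One such sequence, sketched in Python (my illustration, assuming exact rational arithmetic): the Babylonian iteration x -> (x + 2/x)/2 produces rationals whose squares converge to two.

Code:
from fractions import Fraction

def sqrt2_approx(steps):
    # Babylonian (Newton) iteration for sqrt(2): every iterate is rational,
    # and each step roughly doubles the number of correct digits.
    x = Fraction(1)
    for _ in range(steps):
        x = (x + 2 / x) / 2
    return x

x = sqrt2_approx(5)
print(float(x))          # 1.4142135623730951
print(float(x * x - 2))  # a tiny positive rational, tending to zero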

Set theory is a metalanguage for describing mathematics. I never said ZFC is "natural". Just the opposite. Set Theory was an invention of Georg Cantor and was made rigorous by ZFC (although the choice axiom is suspect for some mathematicians, including Penrose). We use languages to describe the world, but the description isn't the world.

I'm also puzzled by your considering uncountable infinity as somehow being "unnatural". I don't see the logic here. If that's what we see, that's what we get.
 
Last edited:
  • #20
apeiron said:
Philosophically speaking, I would suggest you are making a basic mistake. You want to define some fundamental mathematical atom - the integer - and construct your way up from this "natural object" to every "derived object" known to maths.

Not really. I'm hardly up for such an ambitious project. I'm saying history has constructed the mathematics we have today. This construct, I'm saying, was based on a few discoveries and a lot of innovation. Among the discoveries are the integers and irrational numbers. At one point, one could have said that the elements of geometry (when there was just one geometry) were discovered, but it is now possible to formulate geometries as algebras.

apeiron said:
The number 1 is irrational. It is 1.000... and we can keep specifying extra zeros infinitely. Indeed we would need to, to be sure that what we thought was exactly one does not go squirrelly on us and turn out to be 1.000...1 or some other location on the numberline.

For the sake of argument, we can axiomatically take 1 to be 1. But philosophically, we can see how it is also the result of a process of applying a constraint on free dimensionality.

This just shows how much mathematics is subject to innovation and fashion. Nevertheless, there is a difference (I believe) between 1.00000... and 1. The former might be taken as just one of an uncountable infinity of real numbers (I don't think you can say just irrational numbers), the latter as just one of a countable infinity of integers.
 
Last edited:
  • #21
I can see roughly what you are trying to achieve here, and I think it might even be an instructive exercise (even if it is, as I believe, doomed to eventual failure) but I find your particular choice of numbers baffling. How can you claim the division of a whole into two equal parts is in some way less natural than, for example, the ratio of a circle's radius to its circumference?

Your argument for what distinguishes derivative from non-derivative concepts is inconsistent. You appear to be saying that irrationals are non-derivative because they cannot be obtained from finitary algebraic combinations of the integers. Yet the integers can be constructed in such a way from just 1. Why are finitary algebraic operations 'derivative' but the taking of limits not?
 
  • #22
SW VandeCarr said:
Not really. I'm hardly up for such an ambitious project. I'm saying history has constructed the mathematics we have today. This construct, I'm saying, was based on a few discoveries and a lot of innovation. Among the discoveries are the integers and irrational numbers. At one point, one could have said that the elements of geometry (when there was just one geometry) were discovered, but it is now possible to formulate geometries as algebras.

A physics friend of mine once said that what annoyed him about philosophers was that they started asking him about quantum mechanics, and after about 10 minutes they started explaining to him that he was wrong, and how quantum mechanics really worked. In mathematics we just discover ways of handling things, not the things themselves.

We found a nice way of writing down numbers when using rocks to count was not useful anymore. We found that it was useful to define negative numbers, with which we would have nicer properties under subtraction. Before Euclid you had secret rules that were discovered but not proved. But in modern mathematics calling one thing fundamental and another not fundamental is misguided. The "things" are all there; we can construct anything, and (except for a few foundational problems that are really of little practical concern) everything follows from a small list of rules. Constructing an ordered pair of real numbers with a new type of multiplication has useful properties for doing algebraic factorization. That is the discovery in the complex numbers, not the complex numbers themselves. There was no "i" sitting in the basement that was always the square root of -1, waiting for a discoverer.
You call f(x)=sin(sqrt(sinc(ln(x)))) derived because YOU are not familiar with it. In ZFC it is a construction just like any other, and mathematics doesn't care about human history.
 
  • #23
0xDEADBEEF said:
You call f(x)=sin(sqrt(sinc(ln(x)))) derived because YOU are not familiar with it. In ZFC it is a construction just like any other, and mathematics doesn't care about human history.

I don't know what mathematics "cares about" since I don't think it's a sentient thing or really a "thing" at all. It's a thought process. I'm interested in the thought process. Present-day mathematics is the product of a collective historical line of thought. I've studied set theory and know about the laborious process to construct the integers from the empty set, Peano's axiomatization of arithmetic, etc. I have a text on number theory which doesn't use Peano's proof structure. Historically, integers came first. Ratios of integers made fractions, which could be thought of as dividing an integral whole into equal integral (countable) parts and expressing the number of some of those parts in a ratio to the whole number of parts.

You're looking from a present-day vantage point without trying to understand the process that got us where we are. Fine. If you're not interested, you're not interested. However, though you are obviously well trained in math, I suspect you started your learning process with the addition of natural numbers like everyone else. This is a philosophy forum. I didn't start this thread in a mathematics forum for a good reason. I'm not a philosopher, but, among other things, I'm interested in cognitive development, both in the way individuals learn and the way our species collectively learns. From this point of view it's quite reasonable to take the integers and irrationals as non-derivative or "discovered". You may disagree. Fine. I invited challenges and I'm happy to respond to them.

Edit: I think you're also misunderstanding my basic premise. We only discovered the integers and irrationals (in terms of object classes, not individual mathematical results). Everything else was a conscious extension of an existing paradigm. I'm open to challenges here.
 
Last edited:
  • #24
Ravid said:
How can you claim the division of a whole into two equal parts is in some way less natural than, for example, the ratio of a circle's radius to its circumference?

I think the difference is pretty obvious here. We can divide a whole into countable parts any way we want. We can't choose the ratio of the circumference to the diameter of a circle.
 
Last edited:
  • #25
SW VandeCarr said:
1. There is a group which is fundamental to all mathematics. Everything else is derivative. I'm excluding Set Theory and Logic, and considering geometry to be derivative because it can be expressed algebraically and algebra is derivative.

2. The objects of the group are the non-negative integers and the irrational numbers.

3. The operation over the group is addition with its inverse, subtraction.

4. The identity element is zero.
The set with the operation does not form a group. What, for instance, would be the inverse of the non-negative integer 1?
 
  • #26
jimmysnyder said:
The set with the operation does not form a group. What, for instance, would be the inverse of the non-negative integer 1?

Read the rest of the thread. This has all been discussed. The fact is, it's all the integers that form a group closed under addition. Moreover, the irrationals do not form a group under addition, because for any irrational number there is another irrational number such that the sum of the two is a given rational number. It's actually easy to see why. I should have stuck with Kronecker's original assertion rather than trying to "improve" it. This, however, is not the main thrust of this thread (at least as I intended it).
 
Last edited:
  • #27
SW VandeCarr said:
I think the difference is pretty obvious here. We can divide a whole into countable parts any way we want. We can't choose the ratio of the circumference to the diameter of a circle.

That's a pretty specious argument. Both involve choices - we can choose to take the ratio of anything we want. For example, we might have taken the ratio of the circle's radius to its diameter.

Now take two atoms (where by atom I mean indivisible thing). How many ways can you 'choose' to divide this pair up? And what is the size of the parts in terms of the original pair?
 
  • #28
Why not take the group {0, 1} under addition as the fundamental mathematical object? The group of integers under addition can be derived from it.
 
  • #29
Ravid said:
That's a pretty specious argument. Both involve choices - we can choose to take the ratio of anything we want. For example, we might have taken the ratio of the circle's radius to its diameter.

This is not a serious argument. It's clear that simply inverting the ratio doesn't change the basic relationship. There's no choice involved here. If you want to see where I'm coming from, read my response to 0xDEADBEEF in post 23.
 
  • #30
jimmysnyder said:
Why not take the group {0, 1} under addition as the fundamental mathematical object? The group of integers under addition can be derived from it.

Yes. You can certainly use integers to generate other integers. When I say the positive integers are non-derivative as an object class, I'm arguing that they were not generated by a conscious extension of something more basic. Experiments have shown that even some animals and human babies have some notion of counting. The fundamental theorem of arithmetic states that all natural numbers are prime or products of primes (except 0 and 1). However, there is no accepted way to generate all the primes. If the integers were derivative, we should be able to specify all the properties of the set of natural numbers as an object class.
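As a side note, the computational content of that theorem can be sketched in a few lines of Python (my illustration, not the poster's): trial division recovers the prime factorisation, unique up to order, of any integer greater than 1.

Code:
def prime_factors(n):
    # Trial division: peel off the smallest divisor repeatedly.
    # Each divisor found this way is necessarily prime.
    factors = []
    d = 2
    while d * d <= n:
        while n % d == 0:
            factors.append(d)
            n //= d
        d += 1
    if n > 1:
        factors.append(n)
    return factors

assert prime_factors(360) == [2, 2, 2, 3, 3, 5]  # 360 = 2^3 * 3^2 * 5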

Although you didn't ask about it, the set-theoretic definition of a natural number doesn't, IMHO, change what I just said. The idea of a set as a collection is intuitive, but the empty set is not. I think it's fair for an educated non-mathematician to question aspects of the notion of the empty set. Since there is only one empty set, how can it be nested within itself? Because ZFC says it can. Fine, but the empty set is a very abstract notion created by a conscious act of reasoning which does not follow from any paradigm that I know of.
 
  • #31
SW VandeCarr said:
Experiments have shown that even some animals and human babies have some notion of counting.
Having an instinctive notion of counting is much different than having an instinctive notion of the first few numbers.

Having an instinctive notion of the first few numbers is much different than having an instinctive notion of the first, say, ten numbers.

Having an instinctive notion of the first ten numbers is much different than having an instinctive notion of the class of natural numbers.
 
  • #32
SW VandeCarr said:
The idea of a set as a collection is intuitive, but the empty set is not.
I beg to differ -- I, and many others, find it quite intuitive.

Furthermore, I assert that the only reason many laypeople have trouble with the empty set is because of grammar and other linguistic convention.
 
  • #33
SW VandeCarr said:
However, there is no accepted way to generate all the primes.
:confused: There are lots of ways to generate all of the primes, should one want to do such a thing, without even the slightest hint of any controversy surrounding them.
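One such generator, sketched in Python (my illustration; far faster methods exist): unbounded trial division. It is straightforward to prove it emits every prime and nothing else, since a number is emitted exactly when it has no divisor strictly between 1 and itself.

Code:
from itertools import islice

def primes():
    # Emit n exactly when no d in [2, n) divides it, i.e. when n is prime.
    # Every integer is tested in turn, so no prime is ever skipped.
    n = 2
    while True:
        if all(n % d for d in range(2, n)):
            yield n
        n += 1

print(list(islice(primes(), 10)))  # [2, 3, 5, 7, 11, 13, 17, 19, 23, 29]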
 
  • #34
SW VandeCarr said:
Since there is only one empty set, how can it be nested within itself?
It can't. A set that contains nothing but the empty set is not itself the empty set. It is a set. Consider the power set of a non-empty set. The power set contains the empty set.


Use the group: {elephant, violin} with the grunt as the operation:

elephant grunt elephant = elephant
elephant grunt violin = violin
violin grunt elephant = violin
violin grunt violin = elephant

It can be used to derive the group of integers
and it is simpler than the group of integers
and it does not contain integers.
 
  • #35
Hurkyl said:
Having an instinctive notion of counting is much different than having an instinctive notion of the first few numbers.

yes

Hurkyl said:
Having an instinctive notion of the first few numbers is much different than having an instinctive notion of the first, say, ten numbers.

yes

Hurkyl said:
Having an instinctive notion of the first ten numbers is much different than having an instinctive notion of the class of natural numbers.

yes. Given {0, 1, +} we learned to generate all the other positive integers and prove that the set is infinite. We are smarter than animals, adult humans are smarter than baby humans, and mathematicians are smarter than anybody.
 
Last edited:
  • #36
SW VandeCarr said:
I'm arguing that they were not generated by a conscious extension of something more basic.
SW VandeCarr said:
Given {0, 1, +} we learned to generate all the other positive integers
Just to be clear, you've changed your position on this, right?
 
  • #37
Hurkyl said:
:confused: There are lots of ways to generate all of the primes, should one want to do such a thing, without even the slightest hint of any controversy surrounding them.

As usual, you're literally correct. However, is there an algorithm which can be proven to generate all the primes and only the primes?
 
  • #38
Hurkyl said:
Just to be clear, you've changed your position on this, right?

Yes, as regards forming a group. The positive integers do not form a group under addition and subtraction, but one can generate all the positive integers from N + 1 = N', where N' is the successor. (Actually, I still think the non-negative integers can form a group with the weak form of subtraction that doesn't allow for negative numbers, but I won't argue the point.) If you mean changing my position regarding the natural numbers as a non-derivative object class, no. As I said, you need the concept of integer in order to generate more integers.
 
Last edited:
  • #39
SW VandeCarr said:
Yes, as regards forming a group.
Huh? I wasn't talking about that -- I was talking about the bit I quoted.



SW VandeCarr said:
As usual, you're literally correct. However, is there an algorithm which can be proven to generate all the primes and only the primes?
:confused: If an algorithm didn't generate all the primes and only the primes, I wouldn't have called it such.
 
  • #40
jimmysnyder said:
It can't. A set that contains nothing but the empty set is not itself the empty set. It is a set. Consider the power set of a non-empty set. The power set contains the empty set.

A concept that was developed after 5-10,000 years of mathematical experience is not intuitive. Even someone as prolific as Euler almost certainly never thought of it. I actually tried to explain how something that was nothing wasn't nothing because it contained something that was nothing. The victims of my intellectual assault weren't mathematicians, but they were very bright engineers. My phrasing actually comes from them. Set Theory, at the level of ZFC, is mostly used by mathematicians for mathematicians (and logicians).
 
  • #41
SW VandeCarr said:
A concept that was developed after 5-10,000 years of mathematical experience is not intuitive. Even someone as prolific as Euler almost certainly never thought of it. I actually tried to explain how something that was nothing wasn't nothing because it contained something that was nothing. The victims of my intellectual assault weren't mathematicians, but they were very bright engineers. My phrasing actually comes from them. Set Theory, at the level of ZFC, is mostly used by mathematicians for mathematicians (and logicians).
A group is a set.
 
  • #42
Hurkyl said:
Huh? I wasn't talking about that -- I was talking about the bit I quoted.

OK. Then I haven't changed my position. As an object class, the integers are not derivative. It takes the concept of integer and addition to generate the object class (object class in the object-oriented programming sense).

Hurkyl said:
:confused: If an algorithm didn't generate all the primes and only the primes, I wouldn't have called it such.

Are you saying such an algorithm or formula exists?
 
  • #43
jimmysnyder said:
A group is a set.

Of course. What does that have to do with the discussion? The integers are a set. Euler wouldn't have known what you're talking about however.
 
  • #44
SW VandeCarr said:
A concept that was developed after 5-10,000 years of mathematical experience is not intuitive.
The concept has been around since antiquity. It has been studied in philosophy at least as far back as Parmenides. We even have common English words relating to it. Heck, we even have specialized grammar for it.

That the empty type did not appear as a specific object in modern set theory until recently is because modern set theory is a recent invention. :-p
 
Last edited:
  • #45
SW VandeCarr said:
OK. Then I haven't changed my position.
Then you have contradicted yourself -- in the first of the quotes in #36, you state that
they were not generated by a conscious extension of something more basic.​
and in the second quote, you mention how we generate them as a conscious extension of something more basic.
 
  • #46
Hurkyl said:
The concept has been around since antiquity. It has been studied in philosophy at least as far back as Parmenides. We even have common English words relating to it -- e.g. "nothing".

That the empty type did not appear as a specific object in modern set theory until recently is because modern set theory is a recent invention. :-p

Exactly. The idea of nothing is well expressed intuitively with zero, but for some reason Western mathematics didn't adopt it until the Middle Ages (from India via Arabia). However, {} is not zero. We have 0, {0} and {}. Now the logical non-mathematician might well ask what the difference is between these. I think even Euler might have asked that. I think he might have been even more perplexed by {{},{{}}}. Now with ZFC it becomes clear what these expressions mean, but it's not intuitive. There's a reason why teaching math starts with the natural numbers and addition. There's a cognitive order to the way the brain assimilates mathematics. There's a lot of research along these lines and it's not clear if the historical model of mathematical development is the best and most efficient way to introduce new concepts, but I doubt we'll be seeing {} in the first-grade classroom anytime soon.
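To make those expressions concrete, here is a small sketch using Python's frozenset as a stand-in for pure sets (my illustration, not from the thread): under the von Neumann encoding, 0 = {}, 1 = {0} = {{}}, and 2 = {0, 1} = {{},{{}}}, the very set mentioned above.

Code:
# Von Neumann naturals: 0 is the empty set and succ(n) = n U {n}.
EMPTY = frozenset()

def succ(n):
    return n | frozenset({n})

zero = EMPTY          # {}
one = succ(zero)      # {{}}
two = succ(one)       # {{}, {{}}}  -- the set discussed above

assert two == frozenset({zero, one})
assert len(two) == 2  # the set encoding n has exactly n elements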
 
  • #47
Hurkyl said:
Then you have contradicted yourself -- in the first of the quotes in #36, you state that
they were not generated by a conscious extension of something more basic.​
and in the second quote, you mention how we generate them as a conscious extension of something more basic.

No. The object class of integers is basic. Historically and, I believe, cognitively, the integers with addition are basic. The objects in the class are generated as described. I've said this several times. There's no contradiction. The integers, I'm saying, were not the conscious extension of something more basic. However the number 112,985 is generated by a process involving integers. All you need to get started is {0,1,+}. Fractions, on the other hand, were a conscious extension of the concept of integers.

In any case, the first quote from post 36 doesn't show the context, and second quote says nothing about conscious extension.
 
Last edited:
  • #48
SW VandeCarr said:
Exactly. The idea of nothing is well expressed intuitively with zero
Well, yes and no. "Zero" captures an aspect of nothingness in a way similar to how "fifty" captures an aspect of United Statehood.

I think even Euler might have asked that. I think he might have been even more perplexed by {{},{{}}}.
And in the prototypical application of set theory, that makes perfect sense -- how often do you have occasion to treat a type of types of types as an object of study? Laypeople have trouble with logic -- let alone metalogic or metametametametalogic! (i.e. fourth-order logic)

Of course, I doubt Euler would have any difficulty understanding that set in various other applications. For example, I doubt Euler would have any difficulty understanding the hierarchy that set describes (depicted below as a graph), or its application as describing a container containing two containers, one empty, and the other containing an empty container.

\begin{matrix}
 & & \bullet \\
 & \swarrow & & \searrow \\
\bullet & & & & \bullet \\
 & & & & \downarrow \\
 & & & & \bullet
\end{matrix}

The use of this set as denoting the natural number 2 is not meant to help you do arithmetic: it's meant to help you do set theory. The primary application of that construction is to let us use arithmetic facts to do certain kinds of set-theoretic calculations (and it's pretty darned good at it, too). Its most familiar application -- to reduce Peano arithmetic to ZFC -- is a technical argument in model theory, nothing more. (Despite the tendency of people to try and read more into it.)

That said, I do think it is a rather pleasing construction -- we name the number N by making use of a set of N objects. And we conveniently have a set of N objects at hand: the set of natural numbers from 0 through N-1.

There's a lot of research along these lines and it's not clear
And the research I recall is that our instinctive concept of counting number starts becoming fuzzy around the number 5, if not earlier. I believe there's even a pretty good case that our instinctive notion of quantity only has three categories: 0, 1, and more than one. (Although I suspect that one is the fault of language, not instinct)
 
Last edited:
  • #49
SW VandeCarr said:
Historically and, I believe, cognitively, the integers with addition are basic...The integers, I'm saying, were not the conscious extension of something more basic.

This appeal to what is intuitive, what is derived, is cognitively flakey. And the reason why your epistemology would be better founded in the generality of category theory.

Number and addition would be but an example of the more general mathematical dichotomy of object and morphism. The fundamental entity and its space of actions.

And then the animal/infant research would argue that integers and counting are not a basic cognitive act. Although I know many people, including neuroscientists, have made this claim. All those experiments to "prove" that even newborns and chimps can count.

What is basic to brains, to cognition is dichotomisation - the division into figure and ground, event and context. Indeed, object and morphism. Brains find it very easy and natural to find the one among the many, the signal in the noise. Then with effort, the brain can make a succession of dichotomous identifications and carry in working memory the idea of several entities in several locations.

Two, three, and even four can be seen "at a glance". Get up to five or six, seven or eight, and with training people and chimps can make good guesses. Or switch to a second strategy of serial identification - in effect counting by jumping attention across locations. Smart animals with a lot of training (so not natural but socialised and scaffolded by humans) can mimic counting.

So integers are a derived concept if we are talking about the true cognitive basis of our "mathematical" knowledge.

And so is addition. Kids and chimps can be tested by pouring a squat glass of water into a larger taller one. They will think 1 + 0 < 1. There will seem to be less water when it fills a bigger glass.

Again, this is why it is a mistaken enterprise to hope to build the edifice of maths purely by construction from the bottom up using an atomistic entity like an integer and an atomistic action like addition. The "truth" of mathematics lies in the generality that constrains all maths in all its forms. Which is the reason why category theory is a better route to discovering its fundamentals.
 
  • #50
apeiron said:
This appeal to what is intuitive, what is derived, is cognitively flakey. And the reason why your epistemology would be better founded in the generality of category theory.

I will be happy to defer to you on category theory. I have only the most superficial knowledge of it. Again, I'm not trying to derive math. It's already been done. It's in the historical record of how math was actually developed. I think this has practical application as to how math might be more effectively taught. If I were to use a metalinguistic approach, I would probably choose set theory, but I've already given my reasons why I excluded set theory. I'm simply asserting, based on the historical record, that integers are fundamental, without attempting to find the neurophysiologic basis for it. All I want to do is distill the record and make it a basis for teaching math. The very poor state of math education in the US needs some innovation. The "new" math of the 1970s was an abject failure. To say a project of using the actual mainstream evolution of mathematics as a model is flakey, misguided or doomed to failure is to say that mathematics up to and including differential equations is a failure. What I've done so far is no less or more rigorous than the standard textbooks on arithmetic, algebra and analysis. I'm not going any further than that.

The integers are a derived concept if we are talking about the true cognitive basis of our "mathematical" knowledge.

I believe it would be a very illuminating and possibly practical project to find the neurophysiologic basis of mathematics, but it's not my project.
 