
Conjecture:fundamental mathematical group

  1. Aug 6, 2009 #1
    I'm posting here because I can't prove what I say, but I believe I can defend it on a case by case basis. I hope to attract math oriented members to challenge me.

    The mathematician Leopold Kronecker said "God made the integers; everything else is the work of man." http://en.wikipedia.org/wiki/Leopold_Kronecker

    Well, I'm not speculating on the source of the integers but I am proposing the following:

    1. There is a group which is fundamental to all mathematics. Everything else is derivative. I'm excluding Set Theory and Logic, and considering geometry to be derivative because it can be expressed algebraically and algebra is derivative.

    2. The objects of the group are the non-negative integers and the irrational numbers.

    3. The operation over the group is addition with its inverse, subtraction.

    4. The identity element is zero.
    Last edited: Aug 6, 2009
  3. Aug 7, 2009 #2
    What do you mean by excluding set theory? Have you ever looked into ZFC? You can derive your numbers from it; I'd be surprised if you could derive ZFC from your numbers. By the way, with your operations [tex]\mathbb{N}[/tex] is not a group, but [tex]\mathbb{Z}[/tex] is. Why do you even think that [tex]\mathbb{N}[/tex] is somehow fundamental when it doesn't even allow for a decent subtraction?
  4. Aug 7, 2009 #3
    My purpose is not to reconstruct mathematics from ZFC. My purpose is to distinguish fundamental or "natural" objects from defined or "invented" objects. This is the philosophy board. I want to know the minimum number of object classes and operations that would be sufficient to define as much of mathematics as possible, as it might be defined to a computer, or to describe how mathematics might have evolved. For example: multiplication is serial addition. Division is serial subtraction with remainders, or the inverse of multiplication. Algebra is a generalization from arithmetic with constants and variables. Kronecker thought it could all be done with integers only. I included the irrationals since they can only be approximated by fractions, they help with defining limits, and they are "natural", i.e. discovered. I've actually done this on a computer just to see how far I could take it. What I really want to hear is someone who can show what part of mathematics cannot be defined from only what I specified.
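    The "multiplication is serial addition, division is serial subtraction with remainders" idea can be sketched in a few lines of Python (the function names here are my own, for illustration):

```python
def multiply(a, b):
    """Multiplication as serial addition: add a to itself b times.
    Assumes non-negative integer inputs."""
    total = 0
    for _ in range(b):
        total += a
    return total

def divide(a, b):
    """Division as serial subtraction: subtract b from a until less
    than b remains. Assumes a >= 0 and b > 0. Returns (quotient, remainder)."""
    quotient = 0
    while a >= b:
        a -= b
        quotient += 1
    return quotient, a
```

    For example, multiply(3, 4) repeatedly adds 3 four times, and divide(17, 5) repeatedly subtracts 5, leaving a quotient and a remainder.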

    EDIT:You are correct in that I do not have a group. I originally had "integers" but changed to "non-negative integers" because negative numbers aren't "natural". With all the integers, the group is closed under addition and subtraction.
    Last edited: Aug 7, 2009
  5. Aug 7, 2009 #4


    Staff: Mentor

    Just a reminder "that our policies for discussion of science and mathematics hold just as strongly in the Philosophy Forums as anywhere else on the site. Overly speculative or incorrect statements within the domains of science and math may be moved, locked, or deleted at the mentors' discretion, and warnings may be issued. In general, there is more legroom for speculation in philosophical discussion, but it must be in the form of a well motivated question or argument, as described above. In particular, even a 'speculative' argument should be logically consistent with well established scientific knowledge and theory."
  6. Aug 7, 2009 #5
    OK. You read my response to OxDEADBEEF. If you still think this should be removed, please remove it. I believe the development of mathematics from some basic conceptions, such as counting, the invention of fractions, and the discovery of the irrational numbers, is a legitimate subject for this forum. So would George Lakoff, I'm sure.

    Last edited: Aug 8, 2009
  7. Aug 7, 2009 #6
    The positive integers are not closed under subtraction.
  8. Aug 7, 2009 #7
    I understand. It's also necessary to include all integers to define a group closed under addition and subtraction. I can't modify my original post, but I accept this argument and have already indicated this to Oxdeadbeef (post 3). Thanks
  9. Aug 8, 2009 #8
    Well, the path you are trying to take is not very modern, and it ignores that we have found five lines that express the whole content of the natural numbers:

    1. There is a natural number 0.
    2. Every natural number a has a natural number successor, denoted by S(a). Intuitively, S(a) is a+1.
    3. There is no natural number whose successor is 0.
    4. Distinct natural numbers have distinct successors: if a ≠ b, then S(a) ≠ S(b).
    5. If a property is possessed by 0 and also by the successor of every natural number which possesses it, then it is possessed by all natural numbers.
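    The five axioms above can be sketched directly in code: a natural number is either zero or the successor of another natural number, and addition is defined by recursion on successors. This is an illustrative Python sketch, not a standard library:

```python
class Zero:
    """The natural number 0 (axiom 1)."""
    pass

class S:
    """The successor S(n) of a natural number n (axiom 2)."""
    def __init__(self, pred):
        self.pred = pred

def to_int(n):
    """Count successors to recover an ordinary Python integer."""
    count = 0
    while isinstance(n, S):
        n = n.pred
        count += 1
    return count

def add(a, b):
    """Addition by recursion on the second argument:
    a + 0 = a, and a + S(b) = S(a + b)."""
    if isinstance(b, Zero):
        return a
    return S(add(a, b.pred))
```

    For instance, adding S(S(Zero())) and S(S(S(Zero()))) yields a number with five successors, i.e. 5.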

    Not so miraculous. The question of what the minimal set of objects and relations is, from which you can construct all mathematics, is addressed more or less successfully with Turing machines.

    The main part of mathematics has nothing to do with calculating values of functions, but with proofs. Most people only come in contact with calculating, and in that area pretty much everything is based on addition, multiplication, and natural numbers (because we only have finite space on the paper). But from a mathematician's point of view this is a very limited part of mathematics.

    So to put it more precisely: you claim set theory follows from algebra, which follows from the natural numbers. For this claim to hold, everything you write down to prove it must be nothing but numbers, addition, and multiplication; NO TEXT! You surely cannot do this. With ZFC we can.
  10. Aug 8, 2009 #9
    Exactly. That's the whole point. People were using natural numbers long before the present concepts of rigor were developed. This thread is about the evolution of mathematics as a process of invention and discovery. One of the distinguishing features of mathematics is that it rarely dispenses with concepts once they are established, unlike the empirical sciences. I began this thread by quoting Kronecker's assertion that only the integers were "natural" and the rest of mathematics was invented. I originally had planned to base the group on the integers and irrationals (which I think Kronecker should have included, since we in no way "invented" them). However, I changed it at the last minute because the negative integers were a more recent innovation, apparently originating in a number of places.


    They seem to have originated with businessmen looking for ways to express debt. In any case, someone suggested you could subtract seven from three and get -4. Today we regard addition with negative integers and subtraction as distinct operations, but it didn't start off that way. It soon became apparent that you can take odd roots of negative integers, but not even roots. These simply didn't exist. The invention of several Italians (Niccolò Tartaglia and others) to write [tex]{i}[/tex] for [tex]\sqrt{-1}[/tex] was suspect for centuries after. You see where I'm going. It's not so much the history of math as the evolution of mathematical thought as a model of cognitive extension. That is, how paradigms of thought may be extended or replaced.

    Euclid's ideas regarding the axioms of plane geometry were long admired, but not emulated until attempts were made in the 19th century. Cantor's highly original Set Theory was not in the historic line of mathematical development at all. Before it, there was only the rather vague idea of classes. It's well known that Set Theory was not accepted right away, and Cantor suffered for it. Set Theory dispenses entirely with traditional and historic mathematics and uses more basic objects and relations. It's because of this that it became a useful metalanguage for formulating definitions of mathematical objects previously assumed to be primitives. I excluded it for this reason. I think Set Theory lies somewhere between traditional number-based mathematics and logic.

    My basic question is what mathematical objects are discovered and what objects are invented? It's obviously a philosophical question, not a mathematical one. I expect opinions will disagree. Also, given certain discovered objects, how far can pure innovation take us?
    Last edited: Aug 9, 2009
  11. Aug 8, 2009 #10


    Staff Emeritus
    Science Advisor
    Gold Member

    I'm not really sure in what sense you think your group is "fundamental".

    You seem to make a point to talk about "irrational numbers" despite the fact you have no way to talk about the concept of irrationality.

    e.g. your group is isomorphic to the group of all polynomials with rational coefficients

    Another thing missing: you don't seem to have any means to invoke mathematical induction.

    P.S. You made a (minor) omission: your group must contain the rational numbers, otherwise it's not closed under addition.
  12. Aug 8, 2009 #11
    I wrote this to invite comment, so I suppose you could argue about what's fundamental. I was thinking that fundamental objects are discovered and derivative objects are invented. You could argue over just what "invented" and "discovered" mean, but I think Pythagoras did not want to discover that he couldn't express [tex]\sqrt{2}[/tex] as a fraction.

    When you talk about my group, it's only a group if I include all the integers. I had originally written it that way, but made a last-minute change (unfortunately) because I think of the negative integers as derivative (invented by commercial lenders, apparently). I think if you consider the group's objects to be the integers and irrationals, it works. The rationals were an Egyptian innovation (Egyptian fractions).

    You posted before I finished the previous post. I hope it clarifies what I'm talking about. If not, please ask.

    EDIT: All integers can be reached by addition, and the group of integers is closed under addition, but I think I see your point about the irrationals. It looks like I'm going to have to give up any idea that I have a group, because I was reluctant to include the negative integers, and now I have to give up the irrationals unless I take the rationals too. The former are clearly discovered and the latter clearly invented (to my way of thinking).

    From now on, I'm just going to say that the natural numbers and the irrationals were discovered and all else was invented. The idea that I can or cannot make a group out of this doesn't really affect my philosophical argument.
    Last edited: Aug 8, 2009
  13. Aug 8, 2009 #12


    Staff Emeritus
    Science Advisor
    Gold Member

    Have you read Wikipedia's article on negative numbers?

    No: there exist two irrational numbers whose sum is 1/2.
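    A concrete such pair: [tex]\sqrt{2}[/tex] and [tex]\tfrac{1}{2}-\sqrt{2}[/tex] are both irrational, yet their sum is exactly 1/2. A quick numerical check in Python (floating point, so only approximate):

```python
import math

x = math.sqrt(2)        # irrational
y = 0.5 - math.sqrt(2)  # also irrational: a rational minus an irrational
s = x + y               # equals 1/2 exactly in the reals
print(s)
```

    This is why closure fails: a set of irrationals (plus the integers) closed under addition is forced to contain rationals like 1/2.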

    By the way, I fibbed earlier -- I had implicitly assumed you meant only the real algebraic irrational numbers, not all irrational numbers. But the point I was making still stands: you can't do much at all if all you have is real numbers and addition. I'm 99.985% sure you can't even define the notion of "integer"!
  14. Aug 8, 2009 #13
    I did now. Apparently the Chinese were more sophisticated than the Europeans (no surprise there). But in most cases, according to the article I cited, merchants and money lenders were probably the first to use negative numbers. By the way, there's some sharp division of opinion on the use of the Wiki. It's described by some as the source that "shall not be mentioned". I think each article needs to be taken on a case-by-case basis. A well-referenced article with no editorial comments should be as good as any. But that's stuff for another thread. Where does it belong?

    I also understand there's some real non-zero rational power of [tex]{e}[/tex] that's an integer (or so close it's difficult to tell it isn't an integer). All the more reason to consider the irrationals to be discovered (i.e. "fundamental") and for "discovered" to be a meaningful word.
    Last edited: Aug 8, 2009
  15. Aug 8, 2009 #14


    Gold Member

    Philosophically speaking, I would suggest you are making a basic mistake. You want to define some fundamental mathematical atom - the integer - and construct your way up from this "natural object" to every "derived object" known to maths.

    But with the switch from set theory to category theory, it seems that a better way of making progress is to generalise. Which means moving from the local and particular towards the global and universal by the systematic removal of constraints. So for example the shift in level from geometry to topology. Geometry works with specified distances and curvatures. Topology throws away such constraints - measurements added to specify - and so sees reality in a more generalised fashion.

    In number theory, you have a similar journey from the original number line, through imaginary numbers, and then increasingly higher dimensional algebras. Specific operations like division cease to work. A more generalised or "topological" view of arithmetic emerges.

    So the philosophical question is whether you can start with your event and construct your way up to a view of the context. Can you start with an atom and construct your way up to find the void? Or can you find that larger void, that larger context, only by the successive and systematic relaxation of constraints? Relax what is forming the crisp atomness to discover the larger world of which it is a particularly pinched-up and bounded-on-all-sides part.

    A focus on integers is of course a healthy thing philosophically speaking, as we would want to know the minimal set of constraints (as in the Peano axioms, for instance) that will serve to "produce" them.

    So the challenge seems to be not "assume the integer and off we go" but instead: what minimal collection of constraints serves to make integers possible?

    In this spirit, my own take on irrational numbers has always been that the whole numberline is irrational as it is a continuous line, a geometric object. Then we add constraints to create rationals.

    So the number 1 is irrational. It is 1.000..., and we can keep specifying extra zeros infinitely. Indeed we would need to, to be sure that what we thought was exactly one does not go squirrelly on us and turn out to be 1.000.....1 or some other location on the numberline.

    For the sake of argument, we can axiomatically take 1 to be 1. But philosophically, we can see how it is also the result of a process of applying a constraint on free dimensionality.
    Last edited: Aug 8, 2009
  16. Aug 8, 2009 #15
    Yes... [itex]e^0=1[/itex] or [itex]e^{i\pi}=-1[/itex]. In fact, there are infinitely many of them. There is a lot of (in my opinion, useless) debate about maths being discovered or invented. I suppose by discovery you mean that it is something in nature that would lead to such a construction. I myself have never seen an irrational number in nature, partly due to the fact that humans cannot observe infinity. Construction of an irrational number would need the axiom of infinity (which states that an infinite set exists), which I find debatable as something that is in nature.

    Even worse, irrational numbers are uncountable. How on earth could you say that this is natural?
  17. Aug 8, 2009 #16
    Actually, I was thinking it was a real rational power. I've gone back and edited that.

    Regarding discovering irrationals, Pythagoras was surprised and apparently not too happy in discovering them. They certainly weren't in any sense invented. Number theory is all about discovering new properties of integers, particularly primes. Research in number theory has a resemblance to research in the physical sciences, in that you don't know what you might find. If you don't like "natural", just say "discovered" or "not invented".

    Yes, we can never write out an irrational number, but we have finite algorithms for calculating them to whatever precision we need, up to technical limits. Correct me if I'm wrong, but I understand the published algorithms for pi and e have been proven to converge.
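    One such convergent algorithm is the classical series [tex]e = \sum_{n \ge 0} 1/n![/tex], whose partial sums are known to converge to e very quickly. A minimal Python sketch (the function name is my own):

```python
import math

def approx_e(terms):
    """Partial sums of the series e = sum over n of 1/n!.
    Each extra term shrinks the error by roughly a factor of n."""
    total = 0.0
    factorial = 1
    for n in range(terms):
        if n > 0:
            factorial *= n  # factorial now holds n!
        total += 1.0 / factorial
    return total
```

    Twenty terms already agree with math.e to the limits of double precision, which illustrates the point: the number is never written out, but the approximation scheme is finite and provably convergent.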
    Last edited: Aug 8, 2009
  18. Aug 8, 2009 #17


    Staff Emeritus
    Science Advisor
    Gold Member

    x, where x is a variable whose domain is R\Q
  19. Aug 8, 2009 #18
    Pythagoras was long dead when they came across irrational numbers. I am certain that we will not observe something infinite, as we can only exist and observe for a finite time, thus all we can observe is something that is very big.

    My concern is not about writing out irrational numbers. Just call them x, y. It doesn't make a difference. It is the fact that there are uncountably many of them. I highly doubt that uncountably infinite is something that people will say is natural (in the sense of nature), though I admit they may be natural in mathematics. It is natural, assuming ZFC, that we construct the naturals. Then, knowing a bit about groups, it is natural to construct the integers. Knowing about fields, it becomes natural to construct the rationals, and using completion of a metric space or Dedekind cuts, it is natural to make the reals. But do you actually claim that ZFC was discovered/natural? Where is the evidence of, say, the axiom of choice or the axiom of infinity in nature?

    Also, if indeed pi and e did converge, they would not be irrational numbers. Who cares if we have algorithms to compute them? They are just elements of a set.
  20. Aug 8, 2009 #19
    You can approximate a number like [tex]\sqrt{2}[/tex] by a sequence of rationals whose squares converge to two. For pi and e, there is convergence even if we can't specify the quantity exactly in our number system. These are just the kinds of things I'm calling non-derivative. The rationals are derivative because they can be expressed as ratios of integers. Integers are non-derivative.
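    One classical such sequence is the Babylonian (Heron's) method: starting from a rational guess, each step stays rational and the squares of the iterates converge to two. A minimal Python sketch:

```python
def sqrt2_approx(iterations):
    """Babylonian method for sqrt(2): x -> (x + 2/x) / 2.
    Starting from the rational guess 1, every iterate is rational,
    and the squares of the iterates converge to 2."""
    x = 1.0
    for _ in range(iterations):
        x = (x + 2.0 / x) / 2.0
    return x
```

    Convergence is quadratic: the first few iterates are 1, 3/2, 17/12, 577/408, ..., and after a handful of steps the square of the approximation differs from 2 only at the limits of double precision.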

    Set theory is a metalanguage for describing mathematics. I never said ZFC is "natural". Just the opposite. Set Theory was an invention of Georg Cantor and was made rigorous by ZFC (although the axiom of choice is suspect for some mathematicians, including Penrose). We use languages to describe the world, but the description isn't the world.

    I'm also puzzled by your considering uncountable infinity as somehow "unnatural". I don't see the logic here. If that's what we see, that's what we get.
    Last edited: Aug 9, 2009
  21. Aug 9, 2009 #20
    Not really. I'm hardly up for such an ambitious project. I'm saying history has constructed the mathematics we have today. This construct, I'm saying, was based on a few discoveries and a lot of innovation. Among the discoveries are the integers and irrational numbers. At one point, one could have said that the elements of geometry (when there was just one geometry) were discovered, but it is now possible to formulate geometries as algebras.

    This just shows how much mathematics is subject to innovation and fashion. Nevertheless, there is a difference (I believe) between 1.00000... and 1. The former might be taken as just one of an uncountable infinity of real numbers (I don't think you can say just irrational numbers), the latter as just one of a countable infinity of integers.
    Last edited: Aug 9, 2009