
I really hate the rigor in math

  1. Apr 10, 2006 #1
    I know this is an old topic, discussed in some of my posts before, but I would like to open a general thread about this subject. As a physicist I really hate rigor in math. Perhaps the most important thing is that your results make sense, can be tested empirically, or provide solutions to your problems. Rigor is killing science, because you can write a very good and revolutionary article, but if the "rigorous" referees don't agree with your results because they lack rigor, they put your article down. Of course I am not a math genius, but what would have happened if this standard had been applied to Euler and Ramanujan? I have cited two examples of math geniuses criticized for their lack of rigor who nonetheless made really astonishing contributions to math and physics. Rigor should not be so important. Of course I admit that an article nowadays needs to be written in English to be accessible to every scientist in the world, but I think that as long as it is clear and the math is correct, it should be publishable.

    "They prefer to have a beautiful building rather than having something valuable inside it." -- Quote attributed to J. B. J. Fourier, mathematician and physicist, developer of the heat equation and Fourier series.
    Last edited: Apr 10, 2006
  2. jcsd
  3. Apr 10, 2006 #2


    User Avatar
    Science Advisor
    Homework Helper

    Without rigor there is no mathematics; without being truly sure of your proof, it's not a proof at all.

    I imagine many great mathematicians would have risen to the challenge of being rigorous had they lived today. But in modern times, with so many people able to communicate with one another, it is well realised that without rigor in mathematics you can "prove" something that is just not true.
  4. Apr 10, 2006 #3
    But Zurtex, how are you sure you are right? I can give you several examples made by physicists:

    [tex] 1+2+3+4+5+\cdots=(-1/12)_{R}=\zeta(-1) [/tex]

    [tex] 1^{3}+2^{3}+3^{3}+4^{3}+5^{3}+\cdots=(1/120)_{R}=\zeta(-3) [/tex]

    [tex] 1-1!+2!-3!+4!-\cdots=\int_{0}^{\infty}e^{-x}(1+x)^{-1}\,dx [/tex]

    R stands for "resummation" (see Hardy's book "Divergent Series"); the last identity is obtained from Borel resummation. From the rigorous math point of view all these series diverge, yet the relations above were discussed by Euler and others, and although it seems somewhat "spooky", the first two series are used for calculations in string theory and the Casimir effect, giving consistent results.
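    These regularized values can be checked numerically. The sketch below (plain Python, no external libraries; the sum cutoff, integration range, and step size are my own arbitrary choices, not anything from this thread) recovers zeta(-1) = -1/12 from the functional equation zeta(s) = 2^s pi^(s-1) sin(pi s/2) Gamma(1-s) zeta(1-s) together with a direct sum for zeta(2), and evaluates the Borel integral by the trapezoid rule:

    ```python
    import math

    # zeta(2) by direct summation with a 1/N tail correction
    # (the tail sum_{n>N} 1/n^2 is approximately 1/N)
    N = 100_000
    zeta2 = sum(1.0 / n**2 for n in range(1, N + 1)) + 1.0 / N

    # Functional equation at s = -1 ties the divergent sum 1+2+3+...
    # to zeta(2) = pi^2/6.
    s = -1.0
    zeta_m1 = (2.0**s) * math.pi**(s - 1) * math.sin(math.pi * s / 2) \
              * math.gamma(1 - s) * zeta2
    print(zeta_m1)   # close to -1/12 = -0.08333...

    # Borel-style integral for 1 - 1! + 2! - 3! + ... via the trapezoid rule
    # on [0, 40]; e^{-40} ~ 4e-18, so truncating the upper limit is harmless.
    def trapezoid(f, a, b, n):
        h = (b - a) / n
        return h * (0.5 * f(a) + 0.5 * f(b) + sum(f(a + i * h) for i in range(1, n)))

    borel = trapezoid(lambda x: math.exp(-x) / (1 + x), 0.0, 40.0, 80_000)
    print(borel)     # close to 0.5963..., the Euler-Gompertz constant
    ```

    Of course, as the rest of the thread argues, agreeing numerically with a regularized value is evidence, not a proof that the manipulation is legitimate.
    
    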
  5. Apr 10, 2006 #4


    User Avatar
    Staff Emeritus
    Science Advisor
    Gold Member

    Complaining about rigor in mathematics is like complaining about empirical verification in science or all that physical activity in sports. :tongue:

    Physics is most certainly not being killed by rigor -- physicists regularly push symbols around in all sorts of abominable ways. As long as it works, they keep doing it and let other people worry about what the symbols mean and why it works!

    No, from the rigorous math point of view, none of the left hand sides are the thing called "infinite summation" that you learned about in your elementary calculus classes.
  6. Apr 10, 2006 #5

    matt grime

    User Avatar
    Science Advisor
    Homework Helper

    Oh, dear God, not again.

    Look, there is a perfectly valid place in mathematics for doing informal unrigorous work. It is how we figure out what we might want to prove. It is suggestive, it points us in the right direction. But, such things are not proofs. And this is where you fall down, Jose: you assert things are correct as if you have proved them when you have not.

    I am absolutely convinced that every compact object in any triangulated quotient of a module category is finitely generated (plus some other restrictions), for instance. Any case I care to look at proves this. The number of cases I can look at is tiny, vanishingly small in the grand scheme of things. That just means I can't prove it, but I believe it is true, and I can convince other people that it makes sense heuristically (if it were false it would imply that there were some very strange cohomological properties of the module category that we just don't believe it has). But I cannot prove it. That doesn't make the work I've done worthless, it just means it's not a proof.

    So, put your work in perspective. You've come up with some ideas, now try to work out how to make them into proofs (personally, if I were you, I wouldn't bother; I'd go away and learn what others have done in the area).

    You also should not even try to cite Euler (or anyone who operated before modern conventions) or Ramanujan, who became rigorous once he learned what that meant. There is also a large difference between being unrigorous and having intuition that works out properly when people actually check the details (eg Witten). None of your 'insights' have had any evidence put up in support of them to demonstrate why they might be true, and indeed you often assert things that are demonstrably false or that indicate a lack of understanding (i.e. no intuition), e.g. your questions on Lebesgue integration.
    Last edited: Apr 10, 2006
  7. Apr 11, 2006 #6
    I noticed that De Moivre, author of "The Doctrine of Chances," simply said that he could not understand what Bernoulli was writing about probability. Certainly De Moivre* wrote on a rather simple plane, generally about games of chance, and avoided an abstract view. Ramanujan had great talent, but, perhaps, it was quite limited and he could not abstract like his mentor, Hardy.

    As math progressed, I guess, the ability to abstract took on more and more importance, and that may require a different mind-set. I know a student who was lamenting group theory, which he found too abstract, and was told, "You would have liked math much better a few hundred years ago." I guess that was my case too.

    This may or may not have any bearing on what is being discussed, but clearly some minds work differently than others. We have the case of the Formalists and the Intuitionists. I would expect that a Formalist would have an advantage in N-dimensional geometry and in something like studying the Monster Group in group theory, since intuition seems a very unlikely way to go.

    Godel was an Intuitionist or Platonist, http://www.ingentaconnect.com/content/tandf/thpl/2005/00000026/00000003/art00002 and believed that mathematical objects had a reality all of their own, as real as physical objects. I would imagine such a person would be basically thinking in a less rigorous or abstract way.

    True, I assume, rigor by itself never found out anything new -- unless it was about rigor. Strangely enough, given the depth of understanding and the degree of difficulty and strangeness involved, Einstein emphasized intuition. Yet despite his greatness, Einstein never had a good word for the relativistic version of quantum mechanics known as quantum field theory. Its successes did not impress him. Once, in 1912, he said of the quantum theory, "The more successful it is, the sillier it looks."
    http://www.mtnmath.com/whatrh/node107.html I could not imagine any Formalist ever even thinking such a thing. Bertrand Russell once said, "Mathematics is marks on paper where we don't know what we are saying or whether it is true."

    Another quote from the above website is: "For intellect to proceed in physics it must have or work out the mathematics in some detail. Intuition can play with ideas at a looser level. Intuition can leave the conceptual framework of classical particles that quantum mechanics is trapped in. Without knowing the details it can match patterns and see where connections are possible in a different framework. Of course this process is far more error prone than a more narrow intellectual approach, but for many problems it is the only possible approach."

    *For one reason or another, it is my understanding De Moivre was never made a member of the Royal Society. Yet Newton once said, "Go ask Mr. De Moivre, he knows more about it than I do."
    Last edited: Apr 11, 2006
  8. Apr 11, 2006 #7

    matt grime

    User Avatar
    Science Advisor
    Homework Helper

    I don't think platonism or formalism or intuitionism has any bearing on mathematical ability or preference for rigour in maths. It is impossible to tell from a person's mathematics what their philosophy is (unless they explicitly mention it).

    Intuition in mathematics is important, and many great mathematicians have relied on it and often it has turned out that their beliefs were correct. But how do you convince someone else of the correctness of your beliefs? The simplest way is to prove it. The only other way is for your beliefs to keep being proven correct by other people (a la Witten).

    It only takes one counterexample to disprove something; that is why we need to be rigorous: to make sure we don't make mistakes.

    What is the moral of the story? For me it is that intuition and rigour each have their place and each say different things. If you are claiming a proof you need rigour, if you are discussing possibilities then intuition is all you need to invoke. There are many conjectures in maths that are intuitively true in some sense, but we can't prove them. That doesn't stop most people being convinced that they are true without needing a rigorous proof. Note that intuition is something that tends to be developed. It is perfectly possible to have a lot of intuitive thoughts about the Monster Group: it is simple, therefore I know a lot about what its character table might look like.

    I would rather put forward another dichotomy:

    Big Picture thinking and Small Detail thinking.

    The former is somewhat closer to the intuitionist ideas you're stating: it emphasizes large scale behaviour and ignores small numbers of exceptions, it doesn't bother to work out numbers, and leaves things as 'obviously true'.

    Small Detail thinking is dotting the i's crossing the t's, if you will. It is making assertions like 'for n >4 and the bound is sharp' compared to 'for n big enough'.

    Most people start off doing the latter to get an understanding to allow them to do the former.
    Last edited: Apr 11, 2006
  9. Apr 11, 2006 #8
    -A rigorous proof does not always imply a correct proof. Take a look at E. Rips's paper from which the "Bible codes" are inspired: the reasoning is correct but the results are simply nonsense; of course he was famous, so his paper was published.

    -By the way, how do you prove with your math "rigor" that [tex] 1+1=2 [/tex]? Of course I mean without taking it as an axiom, or proving it empirically.

    -I cited Ramanujan and Euler as examples for my claims. They both did not care much for rigor in their papers; in fact, if it were not for Hardy's help (I wish some teachers at my university gave me such a chance, not in math but in physics), Ramanujan would never have published anything in a journal and would have starved to death.

    -How do you justify "infinitesimals"? (I know that nowadays there's a rigorous theory about them, but what about in the 17th century?) Take a value dx being the smallest positive number; then dx/2 is smaller. Contradiction: infinitesimals can't exist, so calculus can't exist, so why use it? The same arguments were used to ban Fourier series.
  10. Apr 11, 2006 #9


    User Avatar
    Science Advisor
    Homework Helper
    Gold Member
    Dearly Missed

    Well, eljose, how would you introduce the symbol 2 into the mathematicians' tool-box without a prior defining statement of it?

    Before the symbol 2 has been (axiomatically) defined to mean SOMETHING, you can't prove anything with it.
  11. Apr 11, 2006 #10


    User Avatar
    Staff Emeritus
    Science Advisor
    Gold Member

    That's incorrect -- a (completely) rigorous proof must be a correct proof: to be completely rigorous, it must start with the hypotheses, and each step follows logically from the previous steps (with nothing left implicit). That is essentially the definition of a correct proof.

    Of course, if the hypotheses are false, then even if we have a correct proof, that does not imply the conclusion is true.

    I don't know much about this particular scenario. But based on what little I do know, he makes a statistical claim, but without following any of the proper procedure that one is supposed to use in statistics.

    If possible at all, it would depend on the definitions of "1", "+", and "2", and whatever axioms you did choose.

    For example, in the theory of the natural numbers, 2 is generally defined to be the successor of 1 (which is in turn defined to be the successor of 0). The addition operation is defined recursively (using s(a) to denote the successor of a) by:

    a + 0 = a
    a + s(b) = s(a) + b

    And the proof that 1+1=2 is a one-liner:

    1+1 = 1+s(0) = s(1)+0 = 2+0 = 2

    Of course, in other contexts, 2 is simply defined as 1+1. After all, it needs to be defined as something. Can you think of a better choice?
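    That recursive definition is easy to mechanize. Here is a minimal sketch (plain Python; the tuple encoding and all names are my own choices, not anything from this thread) encoding naturals as nested tuples with zero = () and s(a) = (a,), and addition defined exactly by the two rules a + 0 = a and a + s(b) = s(a) + b, so the one-line proof becomes a computation:

    ```python
    # Naturals as nested tuples: zero is the empty tuple, successor wraps in a 1-tuple.
    zero = ()

    def s(a):
        """Successor of a."""
        return (a,)

    def add(a, b):
        """Addition by the rules  a + 0 = a  and  a + s(b) = s(a) + b."""
        if b == zero:
            return a
        return add(s(a), b[0])   # b = s(b[0]), so a + s(b[0]) = s(a) + b[0]

    one = s(zero)
    two = s(one)

    # 1 + 1 = 1 + s(0) = s(1) + 0 = 2 + 0 = 2
    print(add(one, one) == two)   # True
    ```

    The point of the exercise is the same as in the post above: once "1", "2", and "+" are pinned down by definitions, 1+1=2 is provable mechanically, and without the definitions there is nothing to prove.
    
    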

    In another context, the ordinal 1 is defined to be the ordering consisting of a single element *. The ordinal 2 is defined to be the ordering consisting of two elements: * < *. u + v is defined, in this context, to mean that you concatenate u and v, and put a < between them. So, 1+1 is:

    (*) < (*) = * < *

    and thus 1 + 1 = 2.

    And in still other contexts, 1+1=2 is false. For example, in category theory, 1 is (by definition) the diagram that looks like this: [diagram image missing]

    1+1 is the diagram that looks like this, as computed by the definition of + as the coproduct: [diagram image missing]

    2 is the diagram that looks like this (by definition): [diagram image missing]

    So 1+1 is clearly unequal to 2.

    So I hope you see why definitions are important. :tongue: Without definitions, you can't prove anything at all!

    These are wonderful examples of why rigor and the axiomatic method are useful!

    Newton's critics complained that he was pushing symbols around in strange and mysterious ways, and that there was no particular reason he should be getting any sort of sensible results at all.

    But if Newton could have rigorously derived calculus, then such criticisms would have held no weight. People would have to accept the validity of his calculus -- but without the force of fully rigorous logic, he had to rely on people sharing his intuition, and on its outstanding empirical success.

    I would say the same about Fourier, but my memory of history is fuzzy -- I think he did eventually manage to prove that his method works, and thus it was accepted by the mathematical community.

    Anyways, I've warned you before about griping about how nobody listens to you, so this is strike #1.
  12. Apr 11, 2006 #11

    matt grime

    User Avatar
    Science Advisor
    Homework Helper

    'correctness' in a mathematical proof is (now) that the argument is a rigorous series of steps that are all mathematically sound and justifiable.

    the arguments are not *mathematically* sound: it is easy to demonstrate rigorously and mathematically why the bible codes are nonsense.

    that is highly speculative and unjustified. if Hardy (or someone else) had not seen a spark of genius in ramanujan's writings then he would have stayed in his menial job in India. you really do yourself no favours by comparing yourself to these people and whining about these things.

    who says dx is the smallest real number?

    there *are* problems with fourier series. it is easy to write down two different functions with the same fourier series (eg the zero function on [0,1] and a function whose support is a finite subset of [0,1]) and fourier's mistake (along with most undergraduates') is to ignore this fact. and now that we have a rigorous foundation for analysis we can explain what this means in terms of measurability and L^p spaces.
  13. Apr 11, 2006 #12
    eljose: By the way how do you prove with your math "rigor" that 1+1=2? of course i mean without taking it as an axiom, or proving it empirically.

    For 1+1 we invented the symbol "2", no? But check out the last quoted part of this entry...

    How do you justify "infinitesimals" (i know that nowadays there,s a rigourous theory about them..

    Rigorous theory does not always mean they are justified; take surreal numbers: "Although they permit most usual mathematical operations like addition, raising to powers and taking logs, it's proving hard to integrate with surreals - crudely, to add up infinite amounts of infinitely small quantities and get sensible results. Without integration, surreals lose much of their interest in physics."

    When it comes to a+b=b+a, one way to check that out is to go to Walmart and ask the cashier about it. I know that if you pay for beer before you pay for pop, the total is the same as the other way around. If you don't trust the cashier, go ask the accountant. I guess it is the same even on Mars. WHAT IF IT WASN'T?? (That's an awesome thought!)

    hurkyl: But if Newton could have rigorously derived calculus, then such criticisms would have held no weight.

    Historical note: Newton did not publish on the calculus, but waited for Leibniz to do so. Newton's implicit contribution was advanced and defended by English mathematicians.

    At one point Peano did not specify, in some way, that one is not equal to 0, and 0 could be the whole system. There is the matter of uniqueness of the system and whether it is what we are really looking for: a postulate system cannot be regarded as a set of "implicit definitions" for the primitive terms: The Peano system permits of many different interpretations, whereas in everyday as well as in scientific language, we attach one specific meaning to the concepts of arithmetic. Thus, e.g., in scientific and in everyday discourse, the concept 2 is understood in such a way that from the statement "Mr. Brown as well as Mr. Cope, but no one else is in the office, and Mr. Brown is not the same person as Mr. Cope," the conclusion "Exactly two persons are in the office" may be validly inferred. But the stipulations laid down in Peano's system for the natural numbers, and for the number 2 in particular, do not enable us to draw this conclusion; they do not "implicitly determine" the customary meaning of the concept 2 or of the other arithmetical concepts. And the mathematician cannot acquiesce in this deficiency by arguing that he is not concerned with the customary meaning of the mathematical concepts; for in proving, say, that every positive real number has exactly two real square roots, he is himself using the concept 2 in its customary meaning, and his very theorem cannot be proved unless we presuppose more about the number 2 than is stipulated in the Peano system. http://www.meta-religion.com/Mathematics/Philosophy_of_mathematics/nature_of_mathematics_2.htm
  14. Apr 12, 2006 #13


    User Avatar
    Staff Emeritus
    Science Advisor
    Gold Member

    I was referring to the famous "ghosts of departed quantities", and similar criticisms. I didn't mean to imply his ideas were universally scorned, nor that Leibniz had nothing to do with any of this.
  15. Apr 12, 2006 #14
    Hurkyl: I was referring to the famous "ghosts of departed quantities", and similar criticisms. I didn't mean to imply his ideas were universally scorned, nor that Leibniz had nothing to do with any of this.

    As far as priorities I have found this: The Calculus Priority Dispute
    Newton had the essence of the methods of fluxions by 1666. The first to become known, privately, to other mathematicians, in 1668, was his method of integration by infinite series. In Paris in 1675 Gottfried Wilhelm Leibniz independently evolved the first ideas of his differential calculus, outlined to Newton in 1677. Newton had already described some of his mathematical discoveries to Leibniz, not including his method of fluxions. In 1684 Leibniz published his first paper on calculus; a small group of mathematicians took up his ideas.

    - Bishop Berkeley, "The Analyst: A Discourse Addressed to an Infidel Mathematician" (1734): "And what are these fluxions? The velocities of evanescent increments. And what are these same evanescent increments? They are neither finite quantities, nor quantities infinitely small, nor yet nothing. May we not call them the ghosts of departed quantities...?"

    I guess, Hurkyl, you are correct about this. Evidently Bishop Berkeley is referring to the work of Newton. Newton died in 1727, so Berkeley is writing at a later date.
  16. Apr 12, 2006 #15


    User Avatar
    Staff Emeritus
    Science Advisor
    Gold Member

    I think the author is trying there to say that Peano's axioms don't talk about what natural numbers "are" -- if we want to use the natural numbers (as defined by Peano's axioms) to talk about things such as the cardinality of a set, that requires some further machinery. (i.e. a "model")

    This is a good thing, incidentally -- it means that we can apply the theorems of Peano arithmetic to any structure that satisfies the Peano axioms (such as the class of finite cardinal numbers).

    But I think that the author is vastly underestimating the ability of a mathematician to make things rigorous when desired or necessary. We communicate with our vague natural language because it's efficient -- not because we don't know of a more precise way! And economy of thought is a good guiding principle -- if you don't need to introduce complications in order to understand something, then don't!

    So a mathematician is content to say "every positive number has exactly two square roots" because any other mathematician knows of at least one way to write that statement precisely. (e.g. the set of square roots of any positive number is bijective with the set {0, 1}) And the precise statement is what's actually proved and used.
  17. Apr 12, 2006 #16
    -Hurkyl, I was just giving my opinion about university teachers not giving me a chance. If you can criticize nobody in an open forum, that is "censorship"; I think a forum should be an open place. By the way (although I would like to hear other people's opinions), why don't I see at arxiv.org or in any reviewed journal, for example, "number theory....." by "Mr. xxxxxx, undergraduate student with no affiliation to any famous or well-regarded university"? Or if Ramanujan's papers were so good, why did he only publish them once he got Hardy's aid?

    -Well, when it comes to the subject (sorry for the aside above): of course now the things I have pointed to -- infinitesimals, Fourier series, asymptotic expansions -- are seriously and rigorously treated, but at the time people used them without rigor and even without knowing whether they were legitimate or not. They thought, "if they give correct results, why not use them?" Infinitesimals were used for more than 100 years while being "ghosts of departed quantities". I think journals should be more permissive about the rigor of the equations if they lead to correct results...
  18. Apr 12, 2006 #17

    matt grime

    User Avatar
    Science Advisor
    Homework Helper

    i don't believe the problem was with criticism per se but the fact that your criticism is unfounded, unwarranted, and repetitive.

    because you don't look hard enough: there are many papers and articles by people not at universities. gregory chaitin was one such name mentioned here recently.

    can you drop this historical crap? for a start the landscape of journal publication (of which arxiv is not one) has changed dramatically, making such comparisons pointless. You also do not do yourself any favours because, according to you, ramanujan did something you think is impossible.

    the reason why arxiv has its policy (and i repeat arxiv is not a journal) is because otherwise it would have crackpot nutcases depositing garbage like this forum used to do before it stopped allowing space to people like Doron.

    and people still do use them without requiring rigorousness, but you cannot prove things about them without being rigorous.

    Oh, please, you're talking about maths journals; they have these standards, and those standards do not exist to suppress correct results. How can you know they're correct if you cannot prove it rigorously?

    Here, draw some points on the circumference of a circle, join them all up, and count how many regions you get. For 2 points you get 2 regions, for 3 you get 4, for 4 you get 8, for 5 you get 16; therefore for n points you get 2^{n-1}, right? I mean, just look! you must do.....

    But you can easily imagine drawing a large circle with 100 points on the perimeter, all of them joined up, and there would need to be 2^{99} regions, which would have to be tiny, minuscule, too small to be what happens.

    neither is a proof, but which is correct?

    A simple rigorous proof is to find an n where it doesn't hold, and that is quite easy: already at n = 6 you get 31 regions, not 32. You can't argue with that. And that is the point; that is why we like rigour.
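    This circle-division puzzle is the classic Moser circle problem; for points in "general position" (no three chords meeting at an interior point) the region count has the standard closed form 1 + C(n,2) + C(n,4), which is a fact about the problem rather than anything stated in this thread. A quick sketch showing where the power-of-two pattern breaks:

    ```python
    from math import comb

    def regions(n):
        """Regions a disc is cut into by all chords between n points on the
        circle, assuming no three chords meet at one interior point."""
        return 1 + comb(n, 2) + comb(n, 4)

    # The first five values look exactly like powers of two...
    print([regions(n) for n in range(1, 6)])   # [1, 2, 4, 8, 16]
    # ...but the pattern breaks at n = 6: 31 regions, not 32.
    print(regions(6))                          # 31
    ```

    Five confirming cases looked like overwhelming evidence, and one computed counterexample settles the question; that is the whole argument for rigour in miniature.
    
    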

    Journal publication is just one aspect of mathematics. we all do unrigorous mathematics all the time precisely because it gives the right answer. however, if we want to publish our result and say 'it is correct', then how can we do that without a proper proof? In short, journals exist to publish correct, rigorous proofs. Sometimes they erroneously publish results where the proof turns out to be wrong, and the papers are retracted or corrected; that's how it works.

    Now, stop whinging, and try to use the correct standards at the correct time. There are plenty of outlets for discussing ideas, and things that you think ought to be true but can't prove properly, they are not (in the main) research journals. Sometimes, though, they do publish things that are not rigorous proofs for many reasons.

    Perhaps, and I know this is hard for you to accept, you might want to consider that your unrigorous arguments are wrong and betray a lack of understanding of the subject?
  19. Apr 12, 2006 #18
    I remember reading your arguments with doron, they were hilarious. I especially liked how he'd make several usernames and comment on his own threads saying he agreed with himself.
  20. Apr 12, 2006 #19


    User Avatar
    Staff Emeritus
    Science Advisor
    Gold Member

    And that's exactly what I've forbidden you to do -- you've been given plenty of latitude, but it's become quite clear that your main purpose is to simply lash out at the people who are not giving your work the lavish praise you think it deserves.

    And while I tend to be very lenient and slow to act, when I say no, I mean it. This is strike #2.

    Just to make it crystal clear: you're not receiving warnings because you're (allegedly) levying criticisms. You're receiving warnings because you are complaining that people are rejecting your work and seem to give absolutely no thought to the possibility that your work is worth rejecting.

    For the time being, I'm even willing to let this discussion continue, as long as you don't explicitly start griping about being rejected. (Of course, it was clear from post #1 that this is implicitly what you were doing)
  21. Apr 12, 2006 #20
    -I would like to read the threads made by the so-called "Doron" before giving my opinion about them. On the question about math journals: in an earlier thread I "proposed" (not exactly rigorously...) a Hilbert-Polya operator that could give the roots of [tex] \zeta(1/2+is)=0 [/tex]. Of course matt grime said my proof was not rigorous, and perhaps in a sense he was right. My reply is: take the potential, solve the Hamiltonian, and obtain the imaginary parts s of the roots of [tex] \zeta(1/2+is)=0 [/tex]. If they are approximately the same as the known roots, then my potential and operator are correct (although without much rigor). So there's a way to check whether I'm wrong or not: just solve the Hamiltonian and obtain its eigenvalues...