Rigorization of analysis and calculus in the 19th century and since

  • Context: Graduate
  • Thread starter: titaniumx3
  • Tags: Analysis, Calculus

Discussion Overview

The discussion centers on the emphasis on the rigorization of analysis and calculus during the 19th century and beyond. Participants explore motivations for increased rigor, historical examples of mathematical errors due to lack of rigor, and the evolution of definitions in mathematics.

Discussion Character

  • Exploratory
  • Historical
  • Debate/contested

Main Points Raised

  • Some participants suggest that Fourier's papers highlighted inconsistencies that prompted a need for rigor in mathematics.
  • One participant notes that the definition of limits was initially intuitive and poorly defined, leading to various interpretations that complicated proofs in analysis.
  • Another participant mentions that the rigorization process was gradual, with figures like Cauchy and Weierstrass advocating for more meticulous approaches, though their efforts were not immediately recognized.
  • It is pointed out that Cauchy made mistakes in his theory of functions due to a lack of distinction between uniform and pointwise convergence, which Weierstrass later addressed.
  • Participants discuss the spelling variations of "rigor" and "rigour," noting that both are acceptable depending on regional usage.

Areas of Agreement / Disagreement

Participants express a range of views on the motivations and implications of rigorization, with no clear consensus on the primary reasons or the impact of specific historical figures.

Contextual Notes

Some limitations in the discussion include the dependence on historical context and the varying interpretations of mathematical concepts that were not rigorously defined at the time.

titaniumx3
Why was there such a huge emphasis on rigourisation of analysis/calculus during the 19th century and onwards? What was the key motivation for increased rigour?

In particular are there any explicit examples of mathematics "going wrong" due to a lack of rigour?


Please feel free to share your knowledge and opinions. :smile:
[BTW, is "rigourisation" even a word? I've also seen "rigorisation" too and sometimes the "s" is a "z"; which one is it?]
 
One of Fourier's papers was a catalyst - some stuff in it I believe didn't make sense to the mathematicians of that time. There was a thread about this topic not too long ago, I think.
 
It started with the definition of the limit, which had previously not been properly defined; it was just understood intuitively. From the limit came proper definitions of many other notions, such as continuity and differentiability. I'm not sure this was the main motivation for the process (usually called the arithmetization of analysis), but it was certainly one of them, because it finally enabled mathematicians to overcome a big problem: with only intuitive notions and no precise definitions, many theorems in analysis could not be proved convincingly, since different people had different intuitions, and different interpretations of the same concept made statements appear to disagree. With proper definitions, they could finally state precisely what they were trying to prove, which is usually a good thing!
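For readers who want the concrete statement: the epsilon-delta definition that eventually replaced the intuitive notion of a limit (usually credited to Bolzano and Cauchy, and given its final form by Weierstrass) reads:

```latex
% Epsilon-delta definition of the limit of f at a:
\lim_{x \to a} f(x) = L
\quad\Longleftrightarrow\quad
\forall \varepsilon > 0 \;\; \exists \delta > 0 \;\; \forall x :\;
0 < |x - a| < \delta \;\implies\; |f(x) - L| < \varepsilon
```

Continuity at $a$ then becomes the special case $L = f(a)$, and differentiability is defined as a limit of difference quotients, so much of analysis rests on this single definition.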
 
My impression is that it is a process that happened slowly. Certain people felt the need for more rigor (Cauchy, Weierstrass) but the value of their meticulousness was not recognized right away.

I love this quote from the first page of my Fourier Analysis textbook,

"M. Cauchy announces that, in order to conform to the Council's wishes, he shall no longer give, as he had done up until now, perfectly rigorous demonstrations." — Council of instruction of l'Ecole Polytechnique, November 24, 1825
 
"American" English uses the spelling "rigor", while "British" English uses "rigour". As far as I know, "rigorisation" and "rigourisation" are both perfectly good words and which one you use depends on which side of the pond you are on.
 
Cauchy made several mistakes in his theory of functions because he had not distinguished properly between uniform and pointwise convergence; Weierstrass later patched this up.
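To make the distinction explicit: in pointwise convergence the threshold $N$ may depend on the point $x$, while in uniform convergence a single $N$ must work for all $x$ at once. Only the order of the quantifiers differs:

```latex
% Pointwise convergence of f_n to f on a set S:
\forall x \in S \;\; \forall \varepsilon > 0 \;\; \exists N \;\; \forall n \ge N :
\; |f_n(x) - f(x)| < \varepsilon

% Uniform convergence (the same N works for every x in S):
\forall \varepsilon > 0 \;\; \exists N \;\; \forall x \in S \;\; \forall n \ge N :
\; |f_n(x) - f(x)| < \varepsilon
```

The classic counterexample to Cauchy's original claim that a limit of continuous functions is continuous: $f_n(x) = x^n$ on $[0,1]$. Each $f_n$ is continuous, but the pointwise limit is $0$ for $x < 1$ and $1$ at $x = 1$, hence discontinuous; the convergence fails to be uniform near $x = 1$.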

And yes, it IS the difference between doing maths and doing nonsense.
 
