lurflurf
Homework Helper
Do not confuse detail with rigor. An approximation can be rigorous. Suppose we wish to establish some fact C, and we observe that A->B->C. It may happen that A is difficult to establish (and may even be false), while B is easy to establish. It is not more rigorous to prove A; proving either A or B proves C. In fact, proving B first may make it easier to prove or disprove A, or A may turn out to be unimportant.
Mathematics in general and calculus in particular often involves approximations. For example
$$\lim_{x \rightarrow 0}\frac{\sin (x)}{x}=1$$
is a fancy way of saying that sin(x)/x is approximately 1 when x is near 0.
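To see how good that approximation is, expand sin(x) in its Maclaurin series (a standard result):

$$\sin(x) = x - \frac{x^3}{6} + \frac{x^5}{120} - \cdots$$

so that

$$\frac{\sin(x)}{x} = 1 - \frac{x^2}{6} + O(x^4).$$

The error in "sin(x)/x is approximately 1" shrinks like x²/6 as x goes to 0, which is exactly the rigorous content of the limit statement.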
If we accept dy and dx together in dy/dx it is silly to reject them when they appear apart.
Saying dy/dx is "more rigorous" than dx is like saying 3/4 is "more rigorous" than 4. In fact dy/dx is less general: it assumes division has been defined and is possible, that dx is not 0, and that our interest is in the ratio. Often these conditions are not met.
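A concrete instance: for y = x², the differential relation

$$dy = 2x\,dx$$

is a valid statement relating dy and dx directly. Dividing through by dx to obtain dy/dx = 2x is an extra step that assumes dx ≠ 0 and that the ratio is what we actually want.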
We can use dx and dy when we move to vectors while dy/dx requires strange trickery to work.
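For example, in the plane the differential of the position vector and the arc-length element

$$d\mathbf{r} = dx\,\mathbf{i} + dy\,\mathbf{j}, \qquad ds = \sqrt{dx^2 + dy^2}$$

use dx and dy separately. No quotient dy/dx appears anywhere, and none is needed; these expressions make sense even where the curve is vertical and dy/dx does not exist.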
Books often include warnings against extremely stupid yet common errors. Introduction to Calculus and Analysis, Volume 1 by Richard Courant and Fritz John warns: "We emphasize that this [the differential] has nothing to do with the vague concept of 'infinitely small quantities.'"
