jppike said:
Before I begin my response, I should mention that I am honestly not one of those who advocate that infinitesimal calculus should be one's first introduction to the calculus. Indeed, I don't believe I have a sound enough understanding of Non-Standard Analysis yet to really make such a decision. I'm just playing devil's advocate:
I actually detest playing devil's advocate. I prefer it when people just state their opinion instead of setting up a pointless argument. So if this is going to be a devil's advocate thing, then this will be my last reply on the topic.
1) If you're going to argue that, in terms of mathematical logic, we have to take something on faith in order to begin a study of the infinitesimal calculus, then you should basically be arguing against all of introductory mathematics. Indeed, in a first introduction to a number of courses we take many things on faith that are deeply rooted in mathematical logic; consider the axiom of choice and induction as the most obvious examples. How often is Zorn's Lemma implicitly used in a first calculus course? Furthermore, in a first course on calculus one is expected to take a lot of results for granted anyway! I recall that in my first calculus course we were expected to just accept, for example, the Intermediate Value Theorem. It wasn't until a year later, in analysis, that I saw a rigorous proof of it for the real numbers. In fact, when one is introduced to the calculus for the very first time (say, in high school), one is rarely introduced to it via Weierstrass's epsilon-delta formalism. A limit is "as x gets arbitrarily close to a, f(x) gets arbitrarily close to L". How is accepting this definition any better than accepting an infinitesimal?
Consider me old-fashioned, but what you accept in the standard approach is much more intuitive than what you accept in the infinitesimal approach. Induction and the axiom of choice are quite natural things to accept. When I first saw induction, I thought it was very obvious. The same with the axiom of choice (it was only later that I found out that it was problematic). The concept of limit is also an obvious one.
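To be concrete about what is being accepted: the informal "arbitrarily close" phrasing is just the usual Weierstrass definition with the quantifiers suppressed (this is standard material, stated here only for reference):

$$\lim_{x \to a} f(x) = L \quad\Longleftrightarrow\quad \forall \varepsilon > 0\ \exists \delta > 0\ \forall x:\ 0 < |x - a| < \delta \implies |f(x) - L| < \varepsilon.$$

So the informal definition is a simplification of something the student will later see in full; nothing new has to be postulated to make it precise.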
On the other hand, I find infinitesimals less intuitive. I mean: the existence of a number e such that 0 < e < 1/n for all positive integers n. I can already hear the questions coming from a high school student:
What's the decimal representation of e??
It doesn't have any.
How can a number not have any representation? Is it a fake number??
Uuuh...
Oh, I get it, e=1-0.99999... no??
Hmmm...
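For the record, here is what's behind that exchange (all of it standard): the reals are Archimedean, so no real number can be a positive infinitesimal, and in the reals the decimal 0.999... is exactly 1:

$$\forall x \in \mathbb{R},\ x > 0:\ \exists n \in \mathbb{N}:\ \tfrac{1}{n} < x \qquad\text{and}\qquad 0.999\ldots = \lim_{k \to \infty}\left(1 - 10^{-k}\right) = 1.$$

So an infinitesimal e with 0 < e < 1/n for all n is necessarily a new, non-real number, and the student's guess e = 1 - 0.999... = 0 doesn't produce one.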
In a calculus course, this might be OK. But what about in a real analysis course?? Such a course is supposed to provide the foundations of analysis, so it should contain a construction of the reals and of the hyperreals. The latter is too hard to do at that level.
OK, you always take some things on faith, but not being able to construct the space you're working with is a big no-no.
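To indicate the machinery involved: the usual route is the ultrapower construction, in which a hyperreal number is an equivalence class of real sequences modulo a non-principal ultrafilter $\mathcal{U}$ on $\mathbb{N}$ (whose existence already requires a fragment of the axiom of choice):

$${}^{*}\mathbb{R} = \mathbb{R}^{\mathbb{N}} / \mathcal{U}, \qquad (a_n) \sim (b_n) \iff \{\, n : a_n = b_n \,\} \in \mathcal{U}.$$

The class of $(1, \tfrac{1}{2}, \tfrac{1}{3}, \ldots)$ is then a positive infinitesimal. Try explaining that before you've even explained Dedekind cuts or Cauchy sequences.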
I realize I am speaking from the point of view of a standard analyst. Perhaps if I had encountered the hyperreals in high school or before, I would talk differently.
2) Non-Standard Analysis is already starting to produce enough applications that I would suggest a modern mathematician ought to be at least somewhat familiar with it, if not to have taken a semester course in it.
To be honest, I have never encountered an application of it. I just think it's a neat concept.
In that case, why is it better to learn standard analysis and then have to convert epsilon-delta language into infinitesimal language, rather than the other way around? Indeed, the main argument is that it's not; that once you have learned the intuitive approach using infinitesimals, the epsilon-delta formulation is easy to transition to. At any rate, the argument that we shouldn't change the way we do it because that's the way we've been doing it isn't really much of an argument, is it?
It's a huge argument. ALL the textbooks are written in standard language, so a lot of wonderful books like Rudin, Pugh, Spivak, etc. would become obsolete. It would be a huge undertaking to revise the existing books or to produce new ones.
And the standard approach needs to be learned anyway. Almost every research article is written in standard language, while almost none are written with infinitesimals. So what's the point??
You really have the choice between "teaching the standard approach" and "teaching the standard approach AND infinitesimals". The latter requires extra time, extra books, and perhaps confused students, all for almost no benefit.
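(To be clear about what "converting" means here: the dictionary between the two languages is precise. For the limit, for instance, writing ${}^{*}f$ for the natural extension of $f$ (defined near $a$) and $\operatorname{st}$ for the standard-part map, one has the standard equivalence

$$\lim_{x \to a} f(x) = L \quad\Longleftrightarrow\quad \operatorname{st}\!\left({}^{*}f(x)\right) = L \ \text{ for every hyperreal } x \approx a \text{ with } x \neq a.$$

Mechanical, yes, but it's one more layer of translation for every single theorem.)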
If one can demonstrate that being introduced to NSA makes for a deeper understanding of the concepts of analysis, then that's the way it should be taught, regardless of how we have been teaching it so far.
I doubt that it really makes for a deeper understanding of the concepts of analysis. If we teach both standard analysis AND the hyperreals, then we need to split the course in half and spend less time on the more important concepts. That would actually reduce the understanding of those concepts.
The same discussion happens with replacing \pi with \tau=2\pi. It's a useless undertaking. It doesn't matter whether \tau is the more intuitive concept; it's too late to change now.
If I could go back in time and replace \pi with \tau the moment it was invented, then I would do so. But \pi is too deeply rooted in the community now. The same goes for the hyperreals (ignoring the fact that the hyperreals require a lot of very nontrivial logic).