- #1

MathematicalPhysicist

Gold Member


- #2

arildno

Science Advisor

Homework Helper

Gold Member

Dearly Missed


- #3

quasar987

Science Advisor

Homework Helper

Gold Member


- #4

arildno

Science Advisor

Homework Helper

Gold Member

Dearly Missed


I said that Weierstrass SYSTEMATIZED the use of this technique; I didn't say he was the first to have these ideas.

- #5

selfAdjoint

Staff Emeritus

Gold Member

Dearly Missed


- #6

quasar987

Science Advisor

Homework Helper

Gold Member


Cool.


- #7

quasar987

Science Advisor

Homework Helper

Gold Member


- #8

arildno

Science Advisor

Homework Helper

Gold Member

Dearly Missed


- #9

mathwonk

Science Advisor

Homework Helper

2020 Award


the first reasonably clear definition of continuity may be due to Bolzano, who wrote in 1817 the following definition (in German): f is continuous at x, if and only if "...the difference f(x+h)-f(x) can be made smaller than any given quantity, if h is taken sufficiently small."

In the following proof of the intermediate value theorem he then uses an epsilon for the "given quantity".

Cauchy, writing 6 years later, in 1823, gives the following somewhat less clear definition: "..the magnitude of the difference f(x+h)-f(x) decreases indefinitely with that of h."

He clarifies this somewhat as follows: I.e. "an infinitesimal increment in the variable produces an infinitesimal increment in the function itself." where he has explained that an infinitesimal is a quantity whose "successive absolute values decrease indefinitely so as to become less than any given quantity."

the modern definition which needs only to write the letter epsilon for bolzano's "given quantity", as he himself does in his proofs, was published first by heine, a student of weierstrass, in 1874, following lectures of weierstrass.
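
for the record, the modern formulation referred to here can be written out as follows (this is the standard textbook statement, not a quote from bolzano, heine, or weierstrass): f is continuous at x if and only if

[tex]\forall \epsilon > 0 \;\; \exists \delta > 0 : \;\; |h| < \delta \implies |f(x+h) - f(x)| < \epsilon[/tex]

which is exactly bolzano's "given quantity" with the letters epsilon and delta filled in.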

it is difficult for a modern person to see the real difference in some of these definitions, once one knows what they mean. Indeed I have seen quotations from newton which sounded essentially like the modern definition of limit. at least one is easily persuaded that the master understood the true meaning very well.

indeed if one spells out the meaning of cauchy's definition as he himself gave it, it is the same. so it seems to me that these workers did themselves understand the meaning of continuity as we do today.

there seems little doubt however that, as has been stated, cauchy did not appreciate the distinction between continuity and uniform continuity, nor that between convergence and uniform convergence, and made errors or at least omissions of that nature.

the translations i have used here are from "A Source Book in Classical Analysis", Harvard University Press, edited by Garrett Birkhoff.

as to why weierstrass and not bolzano gets credit for epsilon - delta continuity, it seems there is a distinction between originating a concept and influencing others to do so. i.e. bolzano may have led the way himself, but did so in papers devoted almost exclusively to such foundations, so few followed, while others like cauchy were more interested in applying these notions to questions about integrals and series.

thus people interested in cauchy's theorems gave him credit for introducing the new methods he was using, even though those were in fact more primitive than the earlier ones of bolzano. finally it seems weierstrass and his students used almost the same formulation as bolzano but applied it to current topics of interest.

it is odd for example that a theorem could be known as the bolzano - weierstrass theorem, when the two men worked some 50 years apart. possibly it was discovered first by bolzano but rediscovered and popularized by weierstrass. (of course there is also a stone - weierstrass theorem and a riemann - kempf theorem, and a gauss - bonnet - chern theorem,.....,in which there is 80-120 years separating the two workers, but in those cases the new results mentioned are significant generalizations of the older ones. In fact a friend of mine once asked Stone just when he had worked with Weierstrass.)

on another thread there is a principle mentioned called there the "arnol'd principle", that if a certain concept or theorem carries a person's name, then it is almost certain that person did not originate that principle. it is then mentioned that indeed arnol'd is not responsible for this principle.



- #10

mathwonk

Science Advisor

Homework Helper

2020 Award


- #11

MathematicalPhysicist

Gold Member


from what i read, his approach had left behind the infinitesimals for the epsilon-delta formulation, but i also read that abraham robinson had resurrected the infinitesimal formulation on a more rigorous footing, in what is now called Non-Standard Analysis (NSA). how do these approaches differ from each other, and what makes NSA non-standard?

- #12

arildno

Science Advisor

Homework Helper

Gold Member

Dearly Missed


loop quantum gravity said: from what i read, his approach had left behind the infinitesimals for the epsilon-delta formulation, but i also read that abraham robinson had resurrected the infinitesimal formulation on a more rigorous footing, in what is now called Non-Standard Analysis (NSA). how do these approaches differ from each other, and what makes NSA non-standard?

I hope math whizzes like Hurkyl, M.G, or mathwonk can give you a bit of solid info on Robinson's approach, but here are a few schematic details on the history of analysis that I don't think are too misleading:

1. As mathwonk has said, the limit concept has been around a long time; formulations by Newton and Bolzano are at times essentially indistinguishable from modern versions. (I would also like to include Archimedes here; his ideas aren't really that far from modern ones, and the proofs he gives are at times, I believe, up to modern standards of rigour.)

2. However, it is in Cauchy that we find the origin of the epsilon/delta formulation (by which ideas of infinitesimals became superfluous), but his texts are at times not careful enough to distinguish between various forms of continuity/convergence (see mathwonk's reply).

3. Weierstrass and his students recognized the difficulties in Cauchy's original work, and took it upon themselves to give his maths a major overhaul to make it fully rigorous.

4. Robinson realized that it was possible to revive the concept of infinitesimals, but that in order to do so properly and rigorously, he had to "leave" the number system called the "reals" and invent a number system sufficiently subtle to include "infinitesimals" (I think this goes under the name of surreals, but I'm not too sure on that issue).

So, his first task was to develop a good axiomatic structure governing his new number system, and then "translate" the concepts from standard analysis (which lives in the real (or complex) number system) into his own number system.
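
As an illustration of that "translation" (the standard textbook statement, not a quote from any post above): in Robinson's framework the derivative of f at x is written with a genuine nonzero infinitesimal [itex]\epsilon[/itex] and the standard-part map st, which rounds a finite hyperreal to its nearest real number:

[tex]f'(x) = \operatorname{st}\left(\frac{f(x+\epsilon) - f(x)}{\epsilon}\right)[/tex]

For [itex]f(x) = x^2[/itex] this gives [itex]\operatorname{st}\left(\frac{(x+\epsilon)^2 - x^2}{\epsilon}\right) = \operatorname{st}(2x + \epsilon) = 2x[/itex], recovering the usual answer with no limit taken anywhere.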

- #13

MathematicalPhysicist

Gold Member


anyway, i've also read about someone's interpretation that replaces the number system with decimals (from the reals), although it puzzled me because decimals are just different representations of the reals, aren't they?

i'll probably need to find it once more to understand what it's really about.

- #14

George Jones

Staff Emeritus

Science Advisor

Gold Member


loop quantum gravity said: from what i read, his approach had left behind the infinitesimals for the epsilon-delta formulation, but i also read that abraham robinson had resurrected the infinitesimal formulation on a more rigorous footing, in what is now called Non-Standard Analysis (NSA). how do these approaches differ from each other, and what makes NSA non-standard?

Real numbers can be constructed from natural numbers. Hyperreal numbers, extensions of the real numbers on which non-standard analysis is based, are constructed from non-standard models of the natural numbers.

An elementary calculus book based on non-standard analysis, "Elementary Calculus: An Approach Using Infinitesimals" (On-line Edition, by H. Jerome Keisler), is available here. From the author's description:

"This is a calculus textbook at the college Freshman level based on Abraham Robinson's infinitesimals, which date from 1960. Robinson's modern infinitesimal approach puts the intuitive ideas of the founders of the calculus on a mathematically sound footing, and is easier for beginners to understand than the more common approach via limits.

The First Edition of this book was published in 1976, and a revised Second Edition was published in 1986, both by Prindle, Weber & Schmidt. The book is now out of print and the copyright has been returned to me as the author. I have decided (as of September 2002) to make the book available for free in electronic form at this site. These PDF files were made from the printed Second Edition."

Hyperreal numbers are first discussed in a section that begins on page 21. The Epilogue relates non-standard analysis to the standard [itex]\epsilon - \delta[/itex] view of limits.

What little I used to know about the relationship between hyperreals and surreals, I, unfortunately, have forgotten.

Regards,

George

- #15

Hurkyl

Staff Emeritus

Science Advisor

Gold Member


("Small" here simply means that the elements of the field will fit into a set.)

But since the surreals are so large (they do not fit into a set), they're fairly unwieldy.

The reason nonstandard analysis is called "nonstandard" comes from model theory. I could, for example, create a theory by writing down a list of all the axioms of the real numbers in first-order logic. Then, the set of real numbers can be used to form a model of this theory.

However, the real numbers are not the only model of these axioms! There are other models, like the real algebraic numbers, or the hyperreal numbers. These other models are called non-standard models.

Non-standard analysis is based upon studying the relationship between a standard model of analysis (built upon the reals) and a non-standard model of analysis (built upon the hyperreals), thus the name.
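
A standard example (not from the post above) of why such non-standard models can exist at all: the ordered-field axioms are first-order sentences, e.g.

[tex]\forall x\,\forall y\,\forall z\;\; \left( x < y \implies x + z < y + z \right),[/tex]

and every such sentence holds in the hyperreals as well. What fails there is the Archimedean property,

[tex]\forall x \;\exists n \in \mathbb{N}\;\; n > x,[/tex]

which quantifies over the natural numbers and so cannot be expressed as a first-order sentence in the language of ordered fields. That loophole is exactly what allows the hyperreals to contain infinite elements, and their infinitesimal reciprocals, while still satisfying all the first-order axioms of the reals.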

- #16

MathematicalPhysicist

Gold Member


here's the paper i mentioned in my previous post; the author of the ideas listed there is Prof. Alexander Abian:

http://www.fidn.org/notes1.pdf [Broken]

