# Number e

by Hepic
Tags: number
Sci Advisor, HW Helper, P: 11,915
 Actually a more interesting discussion would be the universality of e: $$\sum_{k=0}^{\infty} \frac{1}{k!} = \lim_{n\rightarrow\infty} \left(1+\frac{1}{n}\right)^n$$
HW Helper, PF Gold, P: 12,016
 Quote by dextercioby Actually a more interesting discussion would be the universality of e: $$\sum_{k=0}^{\infty} \frac{1}{k!} = \lim_{n\rightarrow\infty} \left(1+\frac{1}{n}\right)^n$$
Aah, happy memories!
Reminds me of my student days, when a couple of fellow students and I became determined to prove that identity directly.

We were proud of ourselves when we managed to do so, not least because we found it rather troublesome to achieve.
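As an aside, the identity is easy to check numerically today. A quick sketch (the 20 series terms and n = 10^6 are arbitrary choices of mine, not from the thread):

```python
import math

# Partial sum of the series: sum_{k=0}^{19} 1/k!
series = sum(1 / math.factorial(k) for k in range(20))

# Sequence term (1 + 1/n)^n for a large n
n = 10**6
limit_term = (1 + 1 / n) ** n

# Both approach the same constant, e = 2.718281828...
print(series, limit_term)
```

The series converges much faster than the limit: 20 terms already agree with e to machine precision, while n = 10^6 only gives about six correct digits.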
P: 1,055
 Quote by arildno "Integral's definition is about area, it has nothing to do with 1/x's antiderivative as a function." QUITE so. That's why it can be regarded as more fundamental, rather than as silly.
I still regard it as silly. The OP was not asking, as I interpret it, for the best formal definition of e but rather why and how mathematicians came to realize e was important. All of my insight into why e is important came before I saw that integral definition and historically that is the case too as far as I know.
HW Helper, PF Gold, P: 12,016
 Quote by Jorriss I still regard it as silly. The OP was not asking, as I interpret it, for the best formal definition of e but rather why and how mathematicians came to realize e was important. All of my insight into why e is important came before I saw that integral definition and historically that is the case too as far as I know.
You are entitled to your opinions, obviously, but not to your own (historical) facts.

That ends my involvement in this thread.
Mentor, P: 18,240
 Quote by arildno "Integral's definition is about area, it has nothing to do with 1/x's antiderivative as a function." QUITE so. That's why it can be regarded as more fundamental, rather than as silly.
The thing is, if I were a student new to calculus and if I were to define ##e## as

$$\int_1^e \frac{1}{x}\,dx = 1$$

then I would think "alright, cool". But I wouldn't see at all why ##e## matters. Why would we care about this new number ##e## in the first place? This definition doesn't answer that question.

It's a bit like how Rudin does it: he defines

$$\sin(x) = x - \frac{x^3}{3!} + \frac{x^5}{5!} - ...$$

and then he defines ##\pi## as the smallest positive zero of ##\sin##. This definition is correct, it is important, and it makes it very easy to derive all the analytic properties. But it doesn't show why I should care about ##\pi##. So in that sense, I don't like the definition.
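For what it's worth, the integral definition quoted above does pin ##e## down numerically all by itself. A sketch that finds the ##x## with ##\int_1^x dt/t = 1## without ever calling a logarithm (Simpson's rule and bisection are my choices, not anything from the thread):

```python
def g(x, n=1000):
    """Simpson's-rule approximation of the area under 1/t from 1 to x."""
    h = (x - 1) / n
    total = 1.0 + 1.0 / x  # endpoint terms f(1) + f(x)
    for i in range(1, n):
        t = 1 + i * h
        total += (4 if i % 2 else 2) / t
    return total * h / 3

# Bisect on [2, 3] for the x with g(x) = 1; that x is e.
lo, hi = 2.0, 3.0
for _ in range(60):
    mid = (lo + hi) / 2
    if g(mid) < 1:
        lo = mid
    else:
        hi = mid
e_estimate = (lo + hi) / 2
print(e_estimate)  # ≈ 2.718281828...
```

The bracket [2, 3] works because g(2) ≈ 0.693 < 1 and g(3) ≈ 1.099 > 1, so the root is trapped from the start.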
P: 1,055
 Quote by arildno You are entitled to your opinions, obviously, but not to your own (historical) facts.
I'm not sure how I was listing my own historical facts. To the best of my knowledge the first use of e where e was recognizable in the modern sense is from the standard limit definition - not the integral listed.
Mentor, P: 18,240
 Quote by Jorriss I'm not sure how I was listing my own historical facts. To the best of my knowledge the first use of e where e was recognizable in the modern sense is from the standard limit definition - not the integral listed.
If we rely on Wikipedia, then the first uses of ##e## were very closely linked to logarithms. So the question really is how people historically saw logarithms: did they see them as an antiderivative or as the inverse of an exponential function? Personally, I prefer the latter definition, but I don't know about the historical one.
Mentor, P: 15,153
 Quote by 1MileCrash Integral's definition is about area, it has nothing to do with 1/x's antiderivative as a function.
By the fundamental theorem of calculus, it has *everything* to do with 1/x's antiderivative as a function.

Let's expand upon Integral's integral a bit. Define ##g(x)=\int_1^x \frac {dt}{t}##. Here's the key question: for what value of x does g(x) equal one? For lack of a better name, let's call the solution of ##g(x)=1## "e". In other words, e is the unique value that satisfies ##\int_1^e \frac {dt}{t} = 1## (i.e., Integral's integral).

To find this value e it will help to find the inverse function of g(x). Let's call this inverse function f(x), defined by f(g(x))=x.

That f'(x)=f(x) pops right out of this definition. Since g(1)=0, f(0)=1. These two results immediately lead to ##f(x) = \sum_{n=0}^{\infty} \frac {x^n}{n!}##, whence ##e = \sum_{n=0}^{\infty} \frac {1}{n!}##.
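The step from ##f'=f## and ##f(0)=1## to the power series can be mirrored in a few lines of code: matching coefficients in ##f(x)=\sum a_n x^n## gives the recurrence ##(n+1)a_{n+1}=a_n## with ##a_0=1##, which forces ##a_n = 1/n!##. A sketch (the 20-term cutoff is an arbitrary choice of mine):

```python
# Coefficients of f(x) = sum a_n x^n: f' = f forces (n+1) a_{n+1} = a_n,
# and f(0) = 1 gives a_0 = 1, so a_n = 1/n!.
coeffs = [1.0]
for n in range(20):
    coeffs.append(coeffs[-1] / (n + 1))

# Evaluating the series at x = 1 recovers e = sum 1/n!.
e_from_series = sum(coeffs)
print(e_from_series)  # ≈ 2.718281828...
```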
 P: 1,295
 Of course it actually has everything to do with the functional antiderivative once we know such things. The point is that the relationship described by the integral does not depend on understanding the connection between area and antiderivative. This is something that arildno and I can agree on, and if you are claiming otherwise, then any argument you had for the definition being fundamental goes right out the window.

And yes, obviously we can derive other definitions and properties of e from an expression that uniquely describes e. The point is that the integral definition of e, no matter how important a question it answers, or how cool it is, does not give any intuitive understanding of why e is important or where it will appear in mathematics. It is the area of this region. OK, so what? Who cares about this region? If someone asked you what e was and why it was important, would you really tell them that it was the area of this region? Or even mention that fact?
Engineering, HW Helper, Thanks, P: 7,115
 Quote by 1MileCrash Of course it actually has everything to do with the functional antiderivative when we know such things.
This is the logic, as I learned it in an analysis course (note, "real analysis" not "Calc 1".)

First, define the idea of derivatives (for example using the epsilon-delta definition) and antiderivatives.

Then you can easily prove that the derivative of ##x^n## is ##nx^{n-1}##, and hence that the antiderivative of ##x^n## is ##x^{n+1}/(n+1)##, for all ##n \ne -1##.

That leaves an unanswered question: what is the antiderivative ##A(x)## of ##1/x##?

You can prove that ##A(x)## has the properties of a logarithm, to some (unknown) base - for example ##A(xy) = A(x) + A(y)## - direct from the definition of the antiderivative.

So you can write ##A(x) = \log_e x##, and Integral's integral defines the value of ##e##.

It then follows that the derivative of the inverse function, ##e^x##, is ##e^x##, and hence we get the power series for ##e^x##.

And after jumping through those hoops to motivate the definition, it's much simpler just to define ##e^x## as a power series (for all complex values of ##x## not just real values), and define ##\ln x## as the inverse function of ##e^x##.
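The logarithm property ##A(xy) = A(x) + A(y)## used in this argument is easy to check numerically. In this sketch, ##A## is approximated with Simpson's rule (my choice of quadrature; the test points 2.0 and 3.5 are arbitrary):

```python
def A(x, n=2000):
    """Numerical antiderivative of 1/t that vanishes at 1 (Simpson's rule)."""
    h = (x - 1) / n
    total = 1.0 + 1.0 / x  # endpoint terms
    for i in range(1, n):
        t = 1 + i * h
        total += (4 if i % 2 else 2) / t
    return total * h / 3

# The logarithm property A(xy) = A(x) + A(y), checked at two sample points.
lhs = A(2.0 * 3.5)
rhs = A(2.0) + A(3.5)
print(lhs, rhs)
```

And per Integral's integral, the base of this logarithm is the x with A(x) = 1.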
 Mentor, P: 7,318
 Wow! That's a lot of discussion for such a simple statement. Obviously area is a more fundamental concept than rate of change. History shows that area has been a useful concept since the very first writings of mankind; rate of change is a much later and subtler concept. Expressing e as an upper limit in an integral is not the most computable way of going about it, but for me, tying it to a simple geometry is much more meaningful than expressing it as an infinite series or any other more esoteric definition. This integral is a large part of the reason e shows up in nature as much as it does.
 Engineering, Sci Advisor, HW Helper, Thanks, P: 7,115
 I guess it depends which bits of nature you look at. You could argue that ##e^{iz} = \cos z + i \sin z## shows up just as much as the integral. Personally, I find the connection with ##\lim_{n \rightarrow \infty}\left(1 + \frac 1 n\right)^n## the least motivational approach. But mathematicians are fairly agnostic about the "right way" to define things. Back in my day, if a math exam paper asked you to prove three results about X, it was perfectly acceptable to define X three different ways for the three proofs, without bothering to show that the definitions were equivalent!
HW Helper, PF Gold, P: 12,016
 Quote by AlephZero Back in my day, if a math exam paper asked you to prove three results about X, it was perfectly acceptable to define X three different ways for the three proofs, without bothering to show that the definitions were equivalent!
As it should be, since you weren't asked to prove the equivalence of those definitions.
 P: 542
 Wish I could remember where I read about the early origin of e, long before it was known by that name. What I read stated that farmers who grow seed crops need to put a certain amount of the seed crop aside to plant the next season... but a portion of what is put aside is actually accounting for the seed crop to be put aside for the season after that, and so on ad infinitum, converging to what would later be identified as e. So staying above e was crucial to surviving... anyone ever hear about this?
HW Helper, PF Gold, P: 3,288
 Quote by AlephZero Personally I find the connection with ##\lim_{n \rightarrow \infty}(1 + \frac 1 n) ^n## the least motivational approach.
This was how I recall first learning about ##e## in high school. It was motivated by the subject of compound interest. We start with some initial sum ##C## and put it in the bank at an interest rate of ##r##.

How much will this grow after one year? It depends on how often the interest is paid. If paid annually, we will simply have ##C(1+r)##.

More generally, if it is paid ##n## times per year, then we will have ##C(1+r/n)^n##, which is larger than ##C(1+r)## because we were able to earn interest on each interest payment in addition to our original sum.

The best case would be if interest was paid continuously, in which case we get ##Ce^r##, where we have defined ##e = \lim_{n\rightarrow \infty}(1+1/n)^n##.

Of course, we have to prove that this limit exists and perform the simple change of variables to evaluate ##\lim_{n\rightarrow \infty}(1+r/n)^n##.

I certainly found this more motivating than any of the competing definitions. But I like money.
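The compounding story above is easy to tabulate. A sketch with illustrative numbers (C = 1000 and r = 5% are my choices, not from the thread):

```python
import math

C, r = 1000.0, 0.05  # illustrative principal and annual interest rate

annual = C * (1 + r)              # interest paid once per year
monthly = C * (1 + r / 12) ** 12  # interest paid 12 times per year
n = 10**6
near_continuous = C * (1 + r / n) ** n  # paid a million times per year
continuous = C * math.exp(r)      # the continuous-compounding limit C e^r

# More frequent compounding earns more, approaching C e^r from below.
print(annual, monthly, near_continuous, continuous)
```

Running this shows the whole point of the definition: each refinement of the payment schedule increases the balance, but the gains shrink toward the ceiling ##Ce^r##.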
