I'm working with my calculus book here, specifically the chapter called "Logarithmic Functions from the Integral Point of View." It works through various proofs about logarithmic functions in a very explicit mathematical manner. However, they are all based on a definition of the natural logarithm that does not seem to match the mathematical rigor that is apparent in almost every other proof in the book. I'm sure this is simply due to my failure to understand a certain aspect of this definition, so I thought I'd ask here and see if you can clear it up.

Basically, it starts out talking about the history of the natural logarithm. Newton and a few others were trying to solve the problem of finding the values of [tex]x_1, x_2, x_3[/tex], etc. for which the areas [tex]A_1, A_2, A_3[/tex], etc. under the curve [tex]y = \frac{1}{x}[/tex] would all be equal.

They found that the x values satisfying this problem were exp(1), exp(2), exp(3), etc. Because of the relationship between the area under a curve and the definite integral, we get the following:

[tex]\int_1^{x}\frac{1}{t}\,dt = \ln x[/tex].
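As a quick numerical sanity check of this relation (a sketch, not part of the book's proof), one can approximate the area under 1/t from 1 to exp(k) with a Riemann sum and confirm it comes out to k; the function name here is just something made up for illustration:

```python
import math

def area_under_reciprocal(x, n=100_000):
    """Midpoint Riemann sum approximating the integral of 1/t from 1 to x."""
    h = (x - 1) / n
    # Sample 1/t at the midpoint of each of the n subintervals
    return sum(h / (1 + (i + 0.5) * h) for i in range(n))

for k in (1, 2, 3):
    x = math.exp(k)
    print(f"area from 1 to e^{k}: {area_under_reciprocal(x):.6f}  (ln x = {math.log(x):.6f})")
```

Each area comes out (approximately) equal to k, matching the claim that the equal-area points sit at exp(1), exp(2), exp(3), etc.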

So, basically, it seems that they experimented with the graph and found that one particular number kept coming up as a solution to this problem, and that this led to the natural logarithm. The issue I have is that it all seems to be built on guesswork: they hunted around for the right x values that would produce equal areas, and they found e. It just doesn't seem as rigorous as everything else in the book. Is there something I'm missing? Or is this just the only way to "discover" a transcendental number like e?