
About 'e'

  1. Sep 28, 2007 #1
    e = (1 + 1/n)^n, as n -> infinity.
    Though I have used this symbol many times, I still cannot understand its true meaning. Why does it have such a strange characteristic---d(e^x)/dx = e^x,
    and e^x ≈ 1 + x as x -> 0---and why can it be used everywhere?
    Who can explain this symbol to me?
     
  3. Sep 28, 2007 #2
    Not a dumb question at all. This is the kind of thing we learn in school before we know that things can be proven, and by the time we learn that things can be proven, we have already learned to accept all the properties of e.

    There are lots of ways to prove these simple results. Unfortunately, most of them use circular logic :wink:

    I could probably fight my way to these results through series.
     
  4. Sep 28, 2007 #3
    arron: e = (1 + 1/n)^n, as n -> ∞

    As you have defined it, we can arrive at the form below by letting n increase beyond bound and taking limits:

    [tex]e(x)=\sum_{i=0}^\infty\frac{x^i}{i!}[/tex]

    Then differentiating term by term shows that e(x) is its own derivative.

    Of course, you might want to study the conditions which allow us to do such things. But the above offers an intuitive reason. And to continue....

    If we then use the value ix, we can split the series into real and imaginary parts, resulting in cos(x) + i·sin(x).

    Using y = e^(ix), this results in the harmonic equation y + y'' = 0, which allows us to find things like:
    (e^(ix) + e^(-ix))/2 = cos(x). We can also integrate, substituting x = e^z, dx = e^z dz, to obtain

    [tex]\int\frac{dx}{x}=\ln x [/tex]
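    The splitting into cos(x) + i·sin(x) is easy to spot-check numerically; here is a quick Python sketch (my own illustration, not from the post):

    ```python
    import cmath
    import math

    # Spot-check Euler's formula e^{ix} = cos(x) + i sin(x) at a few points.
    for x in (0.0, 1.0, math.pi / 3):
        lhs = cmath.exp(1j * x)
        rhs = complex(math.cos(x), math.sin(x))
        assert abs(lhs - rhs) < 1e-12
    print("e^{ix} = cos(x) + i sin(x) checks out numerically")
    ```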
     
    Last edited: Sep 29, 2007
  5. Sep 29, 2007 #4

    Gib Z

    Homework Helper

    Arron: what you have is that [tex]e = \lim_{n\to \infty} \left( 1 + \frac{1}{n}\right)^n[/tex]. This is the special case x = 1 of a more general result:

    [tex]e^x = \lim_{n\to \infty} \left( 1 + \frac{x}{n} \right) ^ n[/tex].

    Well, let's DEFINE e^x to be the amazing function that is its own derivative. From that we get a whole family of functions that are their own derivatives, ae^x, where a is some constant! Let the one we are working with be the member of the family with a = 1.

    From the above equation, let's differentiate the right-hand side with respect to x. Since the limit does not affect the x, we can ignore it while computing the derivative. After applying the power and chain rules appropriately, we get the derivative to be (1 + x/n)^(n-1). Taking the limit again, can you see how the two expressions are in fact the same? Hence it is its own derivative, so it's e^x.

    We could have approached this from the other way, in which we define [tex]e^x = \lim_{n\to \infty} \left( 1 + \frac{x}{n} \right) ^ n[/tex] and show in the same fashion that it is its own derivative, and hence we have the property (e^x)' = e^x.

    I used one definition to get to the other result, and then the other definition to get to the leftover result. That's sort of what jostpuur meant by circular logic: I used a definition to get to a result, then used that result as a definition to 'prove' the original definition.
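    The differentiation step above can also be sanity-checked with a finite difference before taking the limit; a small Python sketch (my own, using arbitrary sample values of n and x):

    ```python
    # Finite-difference check that d/dx (1 + x/n)^n = (1 + x/n)^(n-1),
    # i.e. the expression from the power and chain rules, for finite n.
    def g(x, n):
        return (1 + x / n) ** n

    n, x, h = 50, 1.3, 1e-6
    numeric = (g(x + h, n) - g(x - h, n)) / (2 * h)  # central difference
    exact = (1 + x / n) ** (n - 1)
    print(numeric, exact)  # the two values agree closely
    ```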
     
  6. Sep 29, 2007 #5
    GibZ, there are a lot of problems with your post.

    It is not obvious that such a function exists at all, unless its existence is proven. The notation is also problematic, because a^x already has a previous meaning for all constants a. It should not be defined again.

    As you noted, if a function f exists so that Df = f, then there are more such functions too. Hence your earlier definition does not specify f uniquely. Noting that the other functions have the form af, where a is a constant, and then demanding a = 1, still does not give the function uniquely, because f wasn't unique in the first place.

    [tex]
    D \lim_{n\to\infty} f_n = \lim_{n\to\infty} Df_n
    [/tex]
    is never trivial. It is not even always true.
     
  7. Sep 29, 2007 #6

    Gib Z

    Homework Helper

    "It is not obvious that such a function exists at all, unless its existence is proven. The notation is also problematic, because a^x already has a previous meaning for all constants a. It should not be defined again."

    Sorry, I should have noted the reason that such a function exists, and why it would be an exponential function if it did.
    For a general exponential function f(x)=a^x,
    [tex]f'(x) = \lim_{h\to 0} \frac{a^{x+h} - a^x}{h}[/tex], which comes directly from the definition of the derivative. I guess I made a mistake about the differential operator, but I'm quite sure constants can be taken out of limits.

    Some simplifying gets [tex]f'(x) = a^x \lim_{h\to 0} \frac{a^h - 1}{h}[/tex].
    We can see that this function will be its own derivative if the limit evaluates to 1.
    The following is the furthest thing from a proof possible, but it can give us a hunch: substitute small values of h, like 0.0001, and try a = 3. We get 1.0986..., quite close to one. How about we try a = 2.7? Then it's 0.9933. We know exponential functions are strictly increasing, so the value of a we seek lies somewhere between 2.7 and 3. If we used some root-finding method (not Newton's method, because that requires finding the derivative), we would eventually land somewhere around 2.718281828. That gives me evidence that there is some value where the limit evaluates to 1.
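    The root-finding hunt described above is easy to carry out by machine; here is a minimal bisection sketch in Python (the function names are my own, and h = 1e-8 is just an arbitrary small step):

    ```python
    def slope_at_zero(a, h=1e-8):
        # Approximates lim_{h->0} (a^h - 1)/h, the slope of a^x at x = 0.
        return (a ** h - 1) / h

    def find_base(lo=2.7, hi=3.0, tol=1e-9):
        # Bisect for the base a whose slope at zero equals 1.
        while hi - lo > tol:
            mid = (lo + hi) / 2
            if slope_at_zero(mid) < 1:
                lo = mid
            else:
                hi = mid
        return (lo + hi) / 2

    print(find_base())  # lands near 2.718281828...
    ```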
     
  8. Sep 29, 2007 #7
    Here are some calculations. This is the way I would start. Using Newton's binomial formula we get

    [tex]
    e := \lim_{n\to\infty}\big(1 + \frac{1}{n}\big)^n = \lim_{n\to\infty}\sum_{k=0}^n \frac{n!}{k!(n-k)!}\frac{1}{n^k}
    [/tex]

    [tex]
    =\lim_{n\to\infty}\big(1\; +\; \frac{n}{1}\frac{1}{n}\; +\; \frac{n(n-1)}{2}\frac{1}{n^2}\; +\; \frac{n(n-1)(n-2)}{2\cdot 3}\frac{1}{n^3}\; +\; \frac{n(n-1)(n-2)(n-3)}{2\cdot 3\cdot 4}\frac{1}{n^4}\;+\cdots+\;\frac{n!}{n!}\frac{1}{n^n}\big)
    [/tex]

    [tex]
    =\lim_{n\to\infty}\big(1\; +\; 1\; +\; \frac{n-1}{2n}\; +\; \frac{(n-1)(n-2)}{2\cdot 3 n^2}\; +\; \frac{(n-1)(n-2)(n-3)}{2\cdot 3\cdot 4 n^3}\; + \cdots+\;\frac{(n-1)!}{n!}\frac{1}{n^{n-1}}\big)
    [/tex]

    Let us study the individual terms of this sum. It is quite easy to see that for each k there exists a polynomial p of degree k-2 such that we get the following limit

    [tex]
    \frac{(n-1)(n-2)\cdots(n-k+1)}{k!\; n^{k-1}} = \frac{n^{k-1} + p(n)}{k!\; n^{k-1}} \underset{n\to\infty}{\to} \frac{1}{k!}
    [/tex]

    Thus it is very easy to believe that the limit of the sum gives

    [tex]
    \lim_{n\to\infty}\big(1 + \frac{1}{n}\big)^n = 1 + 1 + \frac{1}{2} + \frac{1}{3!} + \cdots = \sum_{k=0}^{\infty} \frac{1}{k!}
    [/tex]

    I did this calculation in high school, but later understood that it is in fact not yet a correct proof of this series representation of e. I'll give the following problem as an exercise. Suppose we have numbers

    [tex]
    b_{11}
    [/tex]
    [tex]
    b_{21},\; b_{22}
    [/tex]
    [tex]
    b_{31},\; b_{32},\; b_{33}
    [/tex]
    ...
    so that each limit

    [tex]\lim_{n\to\infty} b_{ni} =: c_i[/tex]

    exists. Is the equation

    [tex]
    \lim_{n\to\infty}\sum_{i=1}^{n} b_{ni} = \sum_{i=1}^{\infty} c_i
    [/tex]
    true whenever both sides exist? (hint: the correct answer is no)
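    For what it's worth, here is one concrete family of numbers (my own illustration, spoiling the exercise) showing why: take b_{ni} = 1/n for i = 1, ..., n, so every column limit c_i is 0 while every row sum is 1.

    ```python
    from fractions import Fraction

    def row_sum(n):
        # sum_{i=1}^{n} b_{ni} with b_{ni} = 1/n, computed exactly
        return sum(Fraction(1, n) for _ in range(n))

    for n in (10, 100, 1000):
        print(n, row_sum(n))  # every row sums to exactly 1

    # Each c_i = lim_{n->inf} 1/n = 0, so the right-hand side is 0,
    # while the left-hand side is 1.
    ```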

    Anyway, the previous series representation of e is true, although I personally have never gone through the rigorous proof. I have an idea how it can be done, though.

    How to proceed from this with the exponential function? We can define a function

    [tex]
    \textrm{exp}(x) := \sum_{k=0}^{\infty} \frac{x^k}{k!}
    [/tex]

    You can immediately see that [itex]\textrm{exp}(1)=e^1[/itex]. After proving that exp is continuous, and satisfies the identity

    [tex]
    \textrm{exp}(x+y) = \textrm{exp}(x)\; \textrm{exp}(y)
    [/tex]

    we should already be quite close to [itex]\textrm{exp}(x)=e^x[/itex]. Once the series representation for e^x is available, the rest of the properties become easier to prove.
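    Numerically, the series converges far faster than the original limit definition; a quick comparison sketch in Python (my own, not from the post):

    ```python
    import math

    def e_series(terms):
        # Partial sum of sum_{k=0}^{terms-1} 1/k!
        total, factorial = 0.0, 1
        for k in range(terms):
            total += 1.0 / factorial
            factorial *= k + 1
        return total

    def e_limit(n):
        # (1 + 1/n)^n from the original definition
        return (1 + 1 / n) ** n

    print(e_series(20))    # agrees with math.e essentially to machine precision
    print(e_limit(10**6))  # a million-th term, yet only about 6 correct digits
    print(math.e)
    ```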
     
    Last edited: Sep 29, 2007
  9. Sep 29, 2007 #8
    Here's another point of view. Suppose we want to find a function

    [tex]
    f:\mathbb{R}\to\mathbb{R}
    [/tex]

    that satisfies the equation Df = f, but we don't have a clue what it should be. We can first consider an easier problem. Fix some constant [itex]\Delta x > 0[/itex]. Is there a function

    [tex]
    f:\{0,\;\Delta x,\; 2\Delta x,\; 3\Delta x,\ldots\}\to\mathbb{R}
    [/tex]

    that would satisfy the condition f(0) = 1 and the equation

    [tex]
    f(x) = \frac{f(x+\Delta x) - f(x)}{\Delta x}?
    [/tex]

    The equation can be turned into a recursion relation

    [tex]
    f(x+\Delta x) = (1 + \Delta x)f(x)
    [/tex]

    With induction one can prove that the solution is

    [tex]
    f(x)=(1 + \Delta x)^{x / \Delta x}
    [/tex]

    Now one might guess that if a function f that is its own derivative exists, it cannot be anything but what we get with the limit [itex]\Delta x\to 0[/itex] out of this analogous discrete problem. So

    [tex]
    f(x) = \lim_{\Delta x\to 0}(1 + \Delta x)^{x / \Delta x} = \lim_{n\to\infty} \big(1 + \frac{1}{n}\big)^{nx}
    [/tex]
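    This discrete construction (essentially Euler's method for Df = f) can be iterated directly; a small Python sketch, assuming nothing beyond the recursion above:

    ```python
    import math

    def f_discrete(x, dx):
        # Iterate f(x + dx) = (1 + dx) f(x) from f(0) = 1 up to the point x.
        value = 1.0
        for _ in range(round(x / dx)):
            value *= 1 + dx
        return value

    for dx in (0.1, 0.01, 0.001):
        print(dx, f_discrete(2.0, dx))  # approaches exp(2) as dx -> 0
    print(math.exp(2.0))
    ```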

    I would very much like to know how e was found in the first place. Was it done by Euler, btw? I wouldn't be surprised if it was something like this.
     
  10. Sep 29, 2007 #9

    Gib Z

    Homework Helper

    It was Napier, even before the invention of calculus. Napier, who discovered logarithms, used it as a base with special properties. Finally something I know that's correct =]
     
  11. Sep 29, 2007 #10
    thanks

    Thanks everyone, I really appreciate it. I am trying to understand your replies and make it all clear.
    There is still a question: I have found that it can be used everywhere, even in the Maxwell distribution equations, which belong to the theory of heat. Can you explain why it has so many uses?
     
  12. Sep 29, 2007 #11
    In Lang's book, e^x was defined by the property that its derivative is itself... and starting from there, uniqueness and existence are proven...
     
  13. Dec 11, 2007 #12

    Integral

    Staff Emeritus
    Science Advisor
    Gold Member

    I find this simple relationship intriguing.

    [tex] \int ^e _1 \frac {dx} x = 1 [/tex]

    That is, the area under the inverse curve 1/x between 1 and e is exactly 1. Somehow I think nature uses this fact.
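    This is easy to confirm with a crude numerical quadrature; a midpoint-rule sketch in Python (my own, not part of the thread):

    ```python
    import math

    def integral_of_inverse(a, b, n=100_000):
        # Midpoint-rule approximation of the integral of 1/x from a to b.
        h = (b - a) / n
        return sum(1.0 / (a + (i + 0.5) * h) for i in range(n)) * h

    print(integral_of_inverse(1.0, math.e))  # very close to 1
    ```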
     
  14. Dec 11, 2007 #13

    HallsofIvy

    Staff Emeritus
    Science Advisor

    e^x occurs in so many things because a^x does. And all exponentials are "interchangeable":
    [tex]a^x= e^{\ln(a^x)}= e^{x\ln(a)}[/tex]
    so that any exponential can be written as "e to a constant times x". It's not a matter of "Nature" using anything, but of e being particularly easy for us to use.

    One thing that is done more often in calculus books now than it used to be is to define ln(x) by [itex]\int_1^x (1/t)dt[/itex]. All of the properties of the log, including [itex]ln(x^y)= y\,ln(x)[/itex], can be proven from that.

    You can then define Exp(x) to be the inverse function to ln(x). Then if y = Exp(x), x = ln(y). For x non-zero, [itex]1= (1/x)ln(y)= ln(y^{1/x})[/itex]. Going back to Exp, [itex]y^{1/x}= Exp(1)[/itex] so that [itex]y= Exp(1)^x[/itex]. If we define e to be Exp(1), then the inverse function to ln(x) is y = e^x.
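    The base-change identity at the start of this post is easy to verify numerically; a tiny Python sketch (sample bases and exponents are my own choices):

    ```python
    import math

    # Check a^x = e^{x ln a} for a few bases and exponents.
    for a in (2.0, 3.0, 10.0):
        for x in (-1.5, 0.0, 2.25):
            assert math.isclose(a ** x, math.exp(x * math.log(a)))
    print("every a^x is just e to a constant times x")
    ```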
     