
Homework Help: Taylor series, an intuitive view

  1. May 5, 2013 #1
    1. The problem statement, all variables and given/known data
    I read that the Taylor series is a way to approximate a function f(x) graphically, by addition and subtraction.
    So say I have [itex]\frac{1}{1-x}=1+x+x^{2}+x^{3}+...+x^{n}+...[/itex]
    Suppose x=3; then the left and right sides of the equation can't possibly be equal, unless I misunderstood again.
    I understand that graphically, the curves of the two functions get closer and closer as n approaches infinity... but I can't seem to see that tendency algebraically.
    I wonder where I went wrong, or what I'm misunderstanding here.
    What you have there is a geometric series with ratio ##x##. The series converges only if ##|x|<1##. Otherwise it diverges, and the right-hand side doesn't equal anything at all, let alone ##\frac{1}{1-x}##.
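    You can see this in the partial sums directly (a quick Python sketch, just to look at the numbers; the function name is mine):

    ```python
    # Partial sums s_n = 1 + x + x^2 + ... + x^n of the geometric series.
    # For |x| < 1 they settle down toward 1/(1-x); for |x| >= 1 they blow up.

    def partial_sum(x, n):
        """Sum of x**k for k = 0..n."""
        return sum(x**k for k in range(n + 1))

    # Inside the radius of convergence: x = 0.5, true value 1/(1 - 0.5) = 2
    inside = [partial_sum(0.5, n) for n in (5, 10, 20)]   # creeps up to 2

    # Outside: x = 3 -- the sums grow without bound, nowhere near 1/(1-3) = -0.5
    outside = [partial_sum(3, n) for n in (5, 10, 20)]
    ```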
  4. May 5, 2013 #3
    To elaborate a bit more: that is a Taylor series about x=0 for 1/(1-x). Every Taylor series comes with a radius of convergence - in this case 1. If you have a Taylor series about x=a with a radius of convergence R, then the approximation gets better and better if |x-a|<R, but fails to give useful results if |x-a|>R.
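    The same function expanded about a different point illustrates this. Expanding 1/(1-x) about a=3 (a sketch: since f^(n)(a)/n! = 1/(1-a)^(n+1), the series is [itex]\sum_{n}\frac{(x-3)^{n}}{(-2)^{n+1}}[/itex], with R = |1-3| = 2, the distance from a to the singularity at x=1):

    ```python
    # Taylor series of f(x) = 1/(1-x) about a = 3:
    #   sum over n of (x-3)^n / (-2)^(n+1),  radius of convergence R = 2.

    def f(x):
        return 1 / (1 - x)

    def taylor_at_3(x, degree):
        """Degree-N Taylor polynomial of 1/(1-x) about a = 3."""
        return sum((x - 3)**n / (-2)**(n + 1) for n in range(degree + 1))

    err_inside  = abs(taylor_at_3(4, 30) - f(4))   # |x-3| = 1 < R: tiny error
    err_outside = abs(taylor_at_3(6, 30) - f(6))   # |x-3| = 3 > R: huge error
    ```

    Raising the degree only widens the gap at x=6; no amount of extra terms helps outside the radius.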
  5. May 5, 2013 #4
    Of course. I somehow overlooked the boundaries... thank you.
  6. May 5, 2013 #5
    About the radius of convergence... when I'm calculating, I understand that I can get a solution algebraically, but sometimes I wonder what it would look like on a graph, because I just can't seem to imagine how there could be a boundary to convergence.
    Would I be right to assume it behaves like a [itex]y=x^{4}[/itex] graph, where the ends diverge but the middle converges?
  7. May 5, 2013 #6
    I don't really understand your example (how is it diverging at the ends and converging in the middle when it's just a single function?), but the general description is correct.


    I graphed 1/(1-x) and a couple of approximations with Taylor polynomials. Notice that on the interval [-1,1] the high-degree polynomial is a much better approximation than the linear one. But on [1,2] the high-degree polynomial has gone off the top of the graph and is never coming back - there the linear graph is a better approximation (although not a very good one). And again on the left side, past x=-1, the high-degree polynomial shoots off past the bottom of the graph, and for x<-1 the linear approximation is actually better (but again, not a good one).

    Basically, what's happening intuitively is that in a high-degree polynomial, the highest-degree term dominates when x is really big. If the coefficients of your Taylor series shrink fast enough (for example, like 1/n!), then the region where "the highest-degree term dominates" keeps getting pushed out farther and farther, so the region where the Taylor polynomials are good approximations grows as the degree gets higher. If the coefficients don't shrink fast enough, then you still have a region where the polynomials are good approximations, but the region where the highest-degree term dominates stays put, so outside of your radius of convergence the high-degree polynomials all shoot off to +/- infinity really fast, regardless of what the function does.
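    What the graph shows can also be checked numerically (a Python sketch comparing errors; the sample points 0.9 and 1.5 are my choice):

    ```python
    # Compare a degree-1 and a degree-25 Taylor polynomial of 1/(1-x) about 0,
    # inside and outside the radius of convergence R = 1.

    def f(x):
        return 1 / (1 - x)

    def taylor(x, degree):
        """Taylor polynomial of 1/(1-x) about 0: 1 + x + ... + x^degree."""
        return sum(x**n for n in range(degree + 1))

    # Inside the radius (x = 0.9): the high-degree polynomial is far closer.
    e_lin_in  = abs(taylor(0.9, 1)  - f(0.9))
    e_high_in = abs(taylor(0.9, 25) - f(0.9))

    # Outside (x = 1.5): the degree-25 polynomial shoots off; the line, while
    # wrong, is at least "less wrong".
    e_lin_out  = abs(taylor(1.5, 1)  - f(1.5))
    e_high_out = abs(taylor(1.5, 25) - f(1.5))
    ```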
  8. May 5, 2013 #7
    Oh... I get it, thanks for the graph. I completely misunderstood the definitions of diverge and converge. I thought that converging meant the function approaches a finite limit as x approaches infinity, because when learning about improper integrals we also talked about convergence and divergence...
    But... does this mean that convergence and divergence only apply when there are two or more functions?
    Then how would an improper integral diverge? When I think about convergence, I imagine a curve like [itex]y=\frac{1}{x}[/itex] whose tail approaches y=0, so we should be able to calculate the area under it. But... I just realized that [itex]\int\frac{1}{x}dx[/itex] diverges.
    OK, I don't think I understand the meaning of converge and diverge beyond the formula rules. Is there a way to explain it graphically or intuitively?
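    For instance, a quick check of the areas (a Python sketch, assuming both integrals run from 1 out to a cutoff T; the exact antiderivatives are ln T and 1 - 1/T):

    ```python
    import math

    # Area under 1/x from 1 to T is ln(T): it grows without bound (diverges),
    # while the area under 1/x^2 from 1 to T is 1 - 1/T: it levels off at 1
    # (converges) -- even though both curves approach y = 0.

    def area_1_over_x(T):
        return math.log(T)      # exact value of the integral from 1 to T

    def area_1_over_x2(T):
        return 1 - 1 / T        # exact value of the integral from 1 to T

    growing  = [area_1_over_x(T)  for T in (10, 1e4, 1e8)]   # keeps climbing
    settling = [area_1_over_x2(T) for T in (10, 1e4, 1e8)]   # levels off near 1
    ```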