Power series and Taylor series

In summary: polynomial division rewrites the function as [itex]f(x)=c+\frac{1-c}{1+x^2}[/itex], so only the expansion of [itex]\frac{1}{1+x^2}[/itex] about [itex]x=a[/itex] needs to be worked out, and dynamicsolo's substitution [itex]u=x^2[/itex] then yields a manageable pattern for the derivatives [itex]f^{(n)}(a)[/itex] and hence the Taylor coefficients. The radius of convergence is the distance from [itex]x=a[/itex] to the nearest singularities, the poles at [itex]z=\pm i[/itex], namely [itex]\sqrt{1+a^2}[/itex]. HallsofIvy's geometric-series trick gives the expansion around [itex]x=0[/itex] quickly, but, as Dick pointed out, it does not by itself give the expansion around [itex]x=a[/itex].
  • #1
jacobrhcp
[SOLVED] Power series and Taylor series

Homework Statement



Let f be a function defined by [tex]f(x)=\frac{1+c x^2}{1+x^2}[/tex], and let x be an element of R

For [tex]c\neq1[/tex], find the Taylor series around the point x = a, and find the radius of convergence of the Taylor series.

Homework Equations



for a power series [tex]\sum c_k(x-a)^k[/tex], the radius of convergence is given by [tex]\rho=\frac{1}{\limsup_k |c_k|^{1/k}}[/tex]

the Taylor expansion is given by [tex] f(x)=\sum\frac{f^{(k)}(a) (x-a)^k}{k!}[/tex]
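As a quick illustration of the radius-of-convergence formula: applied to the series [itex]\sum_{k=0}^{\infty}(-1)^k x^{2k}[/itex] (so [itex]c_{2k}=(-1)^k[/itex] and all odd coefficients are zero), it gives

[tex]\limsup_{k\to\infty} |c_k|^{1/k} = 1 \quad\Longrightarrow\quad \rho = 1 .[/tex]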

The Attempt at a Solution



I tried writing out the Taylor series:

[tex]f(x)=\frac{1+c a^2}{1+a^2}+\frac{2a^2(c-1)}{(1+a^2)^2}(x-a)+\frac{4a(c-1)(1-a^4)}{2(1+a^2)^4}(x-a)^2+etc... [/tex]

I did this for the first four terms, but there was no clear pattern that would let me simplify it to an infinite summation, in which case I could have used the formula for the radius of convergence of a power series.
 
  • #2
Another thing you could do is write
[tex]\frac{1+ cx^2}{1+ x^2}= \frac{1}{1+ x^2}+ \frac{cx^2}{1+ x^2}[/tex]
as two separate series, using the fact that the sum of the geometric series [itex]a+ ar+ ar^2+ \cdots + ar^n + \cdots[/itex] is
[tex]\frac{a}{1- r}.[/tex]
For the first series, take a= 1, [itex]r= -x^2[/itex]. For the second take [itex]a= cx^2[/itex], [itex]r= -x^2[/itex]. Then combine the two series.
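Worked out, that suggestion gives, for [itex]|x|<1[/itex],

[tex]\frac{1}{1+x^2}=\sum_{n=0}^{\infty}(-1)^n x^{2n},\qquad \frac{cx^2}{1+x^2}=\sum_{n=1}^{\infty}c\,(-1)^{n-1} x^{2n},[/tex]

and adding the two term by term,

[tex]\frac{1+cx^2}{1+x^2}=1+(1-c)\sum_{n=1}^{\infty}(-1)^n x^{2n}.[/tex]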
 
  • #3
Hey, Halls, that gives you a nice way of getting an expansion around x=0, but jacobrhcp wants to expand around x=a. I know the answer to the radius of convergence question: it's the distance from x=a to the nearest pole in the complex plane. Since the poles are at z=±i, that's easy. But I can't find a neat way of expressing this without using complex variables. I also can't find a nice way of expressing the nth term in the Taylor series, so I definitely can't find a way to use that expression for the radius of convergence.
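For a real center [itex]a[/itex], that distance is easy to write down explicitly: both poles [itex]z=\pm i[/itex] are the same distance from [itex]a[/itex], so

[tex]\rho=|a-i|=|a+i|=\sqrt{a^2+1}.[/tex]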
 
  • #4
To get the expansion about x=a from the expansion about x=0 (which Halls provided a method for), wouldn't it suffice to replace every "x" with (x-a)?
 
  • #5
Gib Z said:
To get the expansion about x=a from the expansion about x=0 (which Halls provided a method for), wouldn't it suffice to replace every "x" with (x-a)?

You should know better than that, Gib.
 
  • #6
Never mind me, terrible, terrible day today :(
 
  • #7
Now I can write the function as a summation using the geometric series, yes (thanks, by the way).
And the geometric series has radius of convergence 1, which matches the distance from 0 to the singularities at x=±i.

But I still have no clue how to find the Taylor expansion, and to my shame I don't even see how Halls provided a way to find the expansion around zero. (Isn't the expansion around zero cut off after the first term, because there is a factor 'a' in every term but the first?)

Moreover, is the radius of convergence of the Taylor series in this case equal to the radius of convergence of the geometric series? I suppose so, but I have no clue how to prove it.
 
  • #8
Me neither, but where did you get that from?
 
  • #9
I decided to delete my post. I don't really think what I posted was helpful at all, but I got it from playing around with the expansion of 1/(1+x^2) about a, and isolating the terms in x^k.

Here it is again in case anyone is curious:

[tex]\frac{1}{1+x^2} = \sum_{n=0}^{\infty} \left[ \sum_{k=n}^{\infty} \frac{(-1)^k}{(1-a)^{k+1}} {k \choose n} a^{k-n} \right] x^{2n}[/tex]
 
  • #10
I've thought about it some more, and I am stuck on the exact same problems as Dick, and it bugs me.
 
  • #11
morphism: how did you get that expression?

To all: If that expression is valid, it solves the problem, since the original function is merely this one with additive and multiplicative constants.
 
  • #12
Why? It's not more useful than the standard geometric summation Halls gave, is it? It also doesn't give a Taylor expansion, nor does it give the radius of convergence easily (though I'm no expert on radii of convergence yet, so I might be wrong on that last part).

Thanks, by the way, to all.
 
  • #13
The expression is a Taylor series centered around a; the method Halls gave was not. The only reason one might not like it is that the coefficient of each power of x is given by another series, but that is perfectly acceptable. Just as you are probably fine with [tex]\int^5_3 \ln x \, dx[/tex] but might not like [tex]\int^5_3 \int^x_1 \frac{1}{t} \, dt \, dx[/tex].

EDIT: By the way, Christmas morning, Merry Christmas everyone =]

EDIT: O wait, I'm quite behind :( Oh well =]
 
  • #14
I suppose this is just another county heard from, but here are my suggestions.

First, the original function might be dealt with by polynomial division, since the degrees of the polynomials in the rational function are equal, as

[tex]\frac{1+ cx^2}{1+ x^2}= c + \frac{1-c}{1+ x^2} , c\neq1 ,[/tex]

so the expansion for [tex]f(x) = \frac{1}{1+ x^2}[/tex] need only be dealt with once.
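If one wants a quick machine check of that division step, and a peek at the opening terms of the expansion about x = a, a short SymPy sketch along the following lines will do (the symbol names here are purely illustrative):

[code]
import sympy as sp

# Symbols: x is the variable, a the expansion point, c the parameter
x, a, c = sp.symbols('x a c', real=True)
f = (1 + c*x**2) / (1 + x**2)

# Verify the polynomial-division identity f(x) = c + (1 - c)/(1 + x^2)
assert sp.simplify(f - (c + (1 - c)/(1 + x**2))) == 0

# Print the first few terms of the Taylor expansion of f about x = a
print(sp.series(f, x, a, n=3))
[/code]

Raising n prints more terms if a longer check of the pattern is wanted.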

As for the derivatives [tex]f^{(n)}(x)[/tex], it seems that what makes this unpleasant is the differentiation of products that soon emerges. Perhaps we might differentiate [tex]\frac{1}{1+ u}[/tex], with u = x^2, instead, without putting everything over a common denominator. The first two derivatives are then

f'(x) = (-1) · (1+u)^(-2) · u'
and
f''(x) = (-1)(-2) · (1+u)^(-3) · u' + (-1) · (1+u)^(-2) · u'' .

The possible virtue of dealing with the derivatives this way is that
u' = 2x, u'' = 2, and all the higher derivatives vanish, so the further derivatives of f(x) do not unfold beyond two terms. Moreover, these two terms can be generalized in a reasonably clear fashion.

f'''(x) = (-1)(-2)(-3) · (1+u)^(-4) · u' + (-1)(-2) · (1+u)^(-3) · u''
+ (-1)(-2) · (1+u)^(-3) · u'' + 0
= (-1)(-2)(-3) · (1+u)^(-4) · u' + 2·(-1)(-2) · (1+u)^(-3) · u'' ;

f^(4) (x) = (-1)(-2)(-3)(-4) · (1+u)^(-5) · u' + (-1)(-2)(-3) · (1+u)^(-4) · u''
+ 2·(-1)(-2)(-3) · (1+u)^(-4) · u'' + 0
= (-1)(-2)(-3)(-4) · (1+u)^(-5) · u' + 3·(-1)(-2)(-3) · (1+u)^(-4) · u'' .

So the nth derivative of f(x) at a becomes

[tex]f^{(n)} (a) = \frac{(-1)^n \, n! \, (2a)}{(1+a^2)^{n+1}} + \frac{(-1)^{n-1} \, (n-1) \, (n-1)! \, (2)}{(1+a^2)^{n}}[/tex]

I've checked this against the derivatives of [tex]\frac{1}{1+ x^2}[/tex] and it appears to work correctly. You could put this all over a common denominator from here, but it doesn't simplify much.
 
  • #15
We love you dynamicsolo =]
 
  • #16
wow o_O you're my hero

Merry Christmas

I have just one last question, and then this problem will be completely solved:

for radii of convergence, is it true that radconv(A)=radconv(B)+radconv(C) iff A=B+C?

dynamicsolo, mastermind.

EDIT: SOLVED!
 
  • #17
jacobrhcp said:
for radii of convergence, is it true that radconv(A)=radconv(B)+radconv(C) iff A=B+C?

I presume you are designating series by A, B, and C. If you consider what a radius of convergence represents, you'll see that that statement can't be correct. After all, the sum of a convergent and a divergent series is going to be divergent. So convergence will only occur within the smallest radius among all the series involved in the sum.
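Stated a bit more precisely, as a standard fact about power series: if [itex]B[/itex] and [itex]C[/itex] are power series about the same center [itex]a[/itex] with radii of convergence [itex]R_B[/itex] and [itex]R_C[/itex], then the termwise sum [itex]B+C[/itex] converges at least for [itex]|x-a|<\min(R_B,R_C)[/itex], and when [itex]R_B\neq R_C[/itex] its radius of convergence is exactly [itex]\min(R_B,R_C)[/itex].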
 

What is a power series?

A power series is an infinite series of the form [itex]\sum_{n=0}^{\infty} a_n (x-c)^n[/itex], where the [itex]a_n[/itex] are coefficients, x is a variable, and c is a constant.
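For example, the geometric series [itex]\sum_{n=0}^{\infty} x^n[/itex] is a power series with every [itex]a_n=1[/itex] and center [itex]c=0[/itex]; it converges to [itex]\frac{1}{1-x}[/itex] for [itex]|x|<1[/itex].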

What is a Taylor series?

A Taylor series is a power series that represents a function as an infinite sum of terms, where each term is built from a derivative of the function evaluated at a single point, the center of the expansion.

How do you find the coefficients of a power series?

When a power series is the Taylor series of a function f, its coefficients are given by the formula [itex]a_n = \frac{f^{(n)}(c)}{n!}[/itex], where c is the center point of the series and n is the term number.
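As a worked example using the function from the thread: for [itex]f(x)=\frac{1}{1+x^2}[/itex] centered at [itex]c=0[/itex], the expansion [itex]\sum_{n=0}^{\infty}(-1)^n x^{2n}[/itex] shows that [itex]a_{2n}=(-1)^n[/itex] and every odd coefficient is zero, which by the formula above means [itex]f^{(2n)}(0)=(-1)^n (2n)![/itex] and [itex]f^{(2n+1)}(0)=0[/itex].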

What is the purpose of using a power series or Taylor series?

Power series and Taylor series are useful for approximating functions, especially those that are difficult to integrate or differentiate in closed form. They can also be used to estimate the values of a function at points where it is hard to evaluate directly.
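As a small numerical illustration, truncating [itex]\frac{1}{1+x^2}=1-x^2+x^4-\cdots[/itex] after two terms at [itex]x=0.1[/itex] gives [itex]1-0.01=0.99[/itex], compared with the exact value [itex]1/1.01\approx 0.990099[/itex], an error of about [itex]10^{-4}[/itex].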

What is the difference between a power series and a Taylor series?

A Taylor series is a power series whose coefficients are computed from the derivatives of a specific function at the center point; a power series in general is simply any series of the form [itex]\sum a_n (x-c)^n[/itex], with the coefficients given directly rather than derived from a function.
