Can the Convergence of Taylor Series

polygamma
Homework Statement


If ##\int_{0}^{1} f(x) g(x) \, dx## converges, and assuming ##g(x)## can be expanded in a Taylor series at ##x=0## that converges to ##g(x)## for ##|x| < 1## (and perhaps for ##x = -1## as well), will it always be true that ##\int_{0}^{1} f(x) g(x) \, dx = \int_{0}^{1} f(x) \sum_{n=0}^{\infty} a_{n} x^{n} \, dx##?

Will the fact that the series doesn't converge at ##x=1## ever be an issue?

A couple of examples are ##\int_{0}^{1} \frac{f(x)}{1-x} \, dx = \int_{0}^{1} f(x) \sum_{n=0}^{\infty} x^{n} \, dx## and ##\int_{0}^{1} f(x) \ln(1-x) \, dx = -\int_{0}^{1} f(x) \sum_{n=1}^{\infty} \frac{x^{n}}{n} \, dx##.
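A quick numerical sanity check of the second example, as a sketch with the assumed choice ##f(x) = 1## (my choice, not from the problem): the substitution ##u = 1-x## gives ##\int_{0}^{1} \ln(1-x) \, dx = -1##, while term-by-term integration gives ##-\sum_{n=1}^{\infty} \frac{1}{n(n+1)}##, which telescopes to the same value.

```python
# Check ∫₀¹ ln(1-x) dx against term-by-term integration of the
# series, with the assumed choice f(x) = 1:
#   ∫₀¹ ln(1-x) dx = -1                    (substitute u = 1 - x)
#   -∑_{n≥1} ∫₀¹ xⁿ/n dx = -∑_{n≥1} 1/(n(n+1))   (telescopes to -1)
exact = -1.0

N = 100_000  # number of series terms kept
termwise = -sum(1.0 / (n * (n + 1)) for n in range(1, N + 1))

# The partial sum misses the exact value by exactly 1/(N+1).
error = abs(termwise - exact)
```

So at least in this case the divergence of the logarithm's series at ##x=1## does not spoil the termwise computation; the partial sums of the integrated series converge to the integral.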

Homework Equations

The Attempt at a Solution


I want to say that it will always be true, since the series fails to converge only at a single point. But I don't know if that's sufficient justification.
 
Is the sum really inside the integral? If so, the integral (by definition) does not care about the value of the integrand at ##x=1## at all.
If the sum is outside the integral, it gets more interesting.
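To illustrate that the single point ##x=1## doesn't matter, here is a sketch using the first example with the hypothetical choice ##f(x) = 1-x##: the integrand ##f(x)\sum_{n} x^{n}## equals ##1## everywhere on ##[0,1)##, so the integral is ##1##, even though the geometric series diverges at ##x=1##.

```python
# With the hypothetical choice f(x) = 1 - x, the integrand
# f(x) * ∑ₙ xⁿ equals 1 on [0, 1), so ∫₀¹ f(x)/(1-x) dx = 1,
# even though the geometric series diverges at x = 1.
exact = 1.0

# Term by term: ∫₀¹ (1-x) xⁿ dx = 1/(n+1) - 1/(n+2), a telescoping sum.
N = 1_000_000
termwise = sum(1.0 / (n + 1) - 1.0 / (n + 2) for n in range(N))

error = abs(termwise - exact)  # the partial sum misses by 1/(N+1)
```

The termwise partial sums converge to the integral at rate ##1/N##, consistent with the point that the integral ignores the behavior at the single point ##x=1##.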
 
What kind of integration has been defined in this analysis course? What definition of a definite integral is used? Are you studying measure theory?
 