polygamma
Homework Statement
If \int_{0}^{1} f(x) g(x) \ dx converges, and assuming g(x) can be expanded in a Taylor series at x=0 that converges to g(x) for |x| < 1 (and perhaps for x = -1 as well), will it always be true that \int_{0}^{1} f(x) g(x) \ dx = \int_{0}^{1} f(x) \sum_{n=0}^{\infty} a_{n} x^{n} \ dx?
Will the fact that the series doesn't converge for x=1 ever be an issue?
A couple of examples are \int_{0}^{1} \frac{f(x)}{1-x} \ dx = \int_{0}^{1} f(x) \sum_{n=0}^{\infty} x^{n} \ dx and \int_{0}^{1} f(x) \ln(1-x) \ dx = -\int_{0}^{1} f(x) \sum_{n=1}^{\infty} \frac{x^{n}}{n} \ dx.
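For instance, taking f(x) = 1 in the second example (purely as an illustration, this choice is not part of the problem): \int_{0}^{1} \ln(1-x) \ dx = -1, while integrating the series term by term gives -\sum_{n=1}^{\infty} \frac{1}{n(n+1)} = -1, so in that particular case the divergence of the series at x = 1 causes no harm.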
Homework Equations
The Attempt at a Solution
I want to say that it will always be true, since the series fails to converge only at the single point x = 1, and changing an integrand at a single point doesn't change the value of the integral. But I don't know if that's sufficient justification.
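Here is a quick numerical sketch of the same idea (assuming SciPy is available; the choice f(x) = x is just a hypothetical test function, not part of the problem). It compares the direct integral of f(x) \ln(1-x) over [0, 1] with the term-by-term integrated series.

```python
# Sketch of a numerical check, not a proof: compare the integral of
# f(x)*ln(1-x) over [0, 1] with the termwise-integrated Taylor series,
# using the hypothetical test function f(x) = x.
import numpy as np
from scipy.integrate import quad

def f(x):
    return x  # illustrative choice only; any f with a convergent integral could be tried

# Direct integration; the logarithmic singularity at x = 1 is integrable,
# and quad typically does not sample the endpoints themselves.
direct, _ = quad(lambda x: f(x) * np.log1p(-x), 0, 1)

# Termwise: -sum_{n>=1} (1/n) * integral_0^1 f(x) x^n dx, truncated at N terms.
# The partial sums converge slowly here (the tail is on the order of 1/N).
N = 2000
termwise = -sum(quad(lambda x: f(x) * x**n, 0, 1)[0] / n for n in range(1, N + 1))

print(f"direct   = {direct:.4f}")    # about -0.75
print(f"termwise = {termwise:.4f}")  # approaches -0.75 as N grows
```

Both numbers come out near -3/4, which matches the exact value \int_{0}^{1} x \ln(1-x) \ dx = -\frac{3}{4}.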