Expanding in a Taylor series

1. Jan 6, 2015

polygamma

1. The problem statement, all variables and given/known data
Suppose $\int_{0}^{1} f(x) g(x) \ dx$ converges, and suppose $g(x)$ has a Taylor series $\sum_{n=0}^{\infty} a_{n} x^{n}$ at $x=0$ that converges to $g(x)$ for $|x| < 1$ (and perhaps for $x = -1$ as well). Will it always be true that $\int_{0}^{1} f(x) g(x) \ dx = \int_{0}^{1} f(x) \sum_{n=0}^{\infty} a_{n} x^{n} \ dx$?

Will the fact that the series doesn't converge at $x=1$ ever be an issue?

A couple of examples are $$\int_{0}^{1} \frac{f(x)}{1-x} \ dx = \int_{0}^{1} f(x) \sum_{n=0}^{\infty} x^{n} \ dx$$ and $$\int_{0}^{1} f(x) \ln(1-x) \ dx = -\int_{0}^{1} f(x) \sum_{n=1}^{\infty} \frac{x^{n}}{n} \ dx.$$
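As a quick sanity check (taking $f(x) = 1$ in the second example, purely for illustration), integrating the series term by term does give the right answer there: $$\int_{0}^{1} \ln(1-x) \ dx = -1 \quad \text{and} \quad -\sum_{n=1}^{\infty} \int_{0}^{1} \frac{x^{n}}{n} \ dx = -\sum_{n=1}^{\infty} \frac{1}{n(n+1)} = -1.$$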

2. Relevant equations

3. The attempt at a solution
I want to say that it will always be true, since the series only fails to converge at the single point $x=1$. But I don't know if that's sufficient justification.
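To test the intuition on the first example (taking $f(x) = 1-x$ purely as an illustration), the partial sums give $$\int_{0}^{1} (1-x) \sum_{n=0}^{N} x^{n} \ dx = \int_{0}^{1} \left(1 - x^{N+1}\right) dx = 1 - \frac{1}{N+2} \longrightarrow 1 = \int_{0}^{1} \frac{1-x}{1-x} \ dx,$$ even though the series itself diverges at $x=1$.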

Last edited: Jan 6, 2015
2. Jan 6, 2015

Staff: Mentor

Is the sum really inside the integral? If yes, the integral (by definition) does not care about the value of the integrand at $x=1$ at all.
If the sum is outside the integral, it gets more interesting.
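To spell out the distinction with the geometric-series example from post #1: with the sum inside, the integrand is $f(x) \sum_{n=0}^{\infty} x^{n} = \frac{f(x)}{1-x}$ on $[0,1)$, and changing (or leaving undefined) its value at the single point $x=1$ does not affect the integral. With the sum outside, the question becomes whether $$\int_{0}^{1} f(x) \sum_{n=0}^{\infty} x^{n} \ dx = \sum_{n=0}^{\infty} \int_{0}^{1} f(x) x^{n} \ dx,$$ i.e. whether the sum and the integral may be interchanged, and that needs a separate justification.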

3. Jan 7, 2015

Stephen Tashi

What kind of integration has been defined in this analysis course? What definition of a definite integral is used? Are you studying measure theory?