## Homework Statement

If [itex] \int_{0}^{1} f(x) g(x) \ dx [/itex] converges, and assuming [itex]g(x)[/itex] can be expanded in a Taylor series at [itex]x=0[/itex] that converges to [itex]g(x)[/itex] for [itex]|x| < 1[/itex] (and perhaps for [itex] x= -1 [/itex] as well), will it always be true that [itex] \int_{0}^{1} f(x) g(x) \ dx = \int_{0}^{1} f(x) \sum_{n=0}^{\infty} a_{n} x^{n} \ dx [/itex]? That is, can the series always be integrated term by term, so that the integral equals [itex] \sum_{n=0}^{\infty} a_{n} \int_{0}^{1} f(x) x^{n} \ dx [/itex]?

Will the fact that the series doesn't converge at [itex]x=1[/itex] ever be an issue?

A couple of examples are [tex] \int_{0}^{1} \frac{f(x)}{1-x} \ dx = \int_{0}^{1} f(x) \sum_{n=0}^{\infty} x^{n} \ dx [/tex] and [tex] \int_{0}^{1} f(x) \ln(1-x) \ dx = -\int_{0}^{1} f(x) \sum_{n=1}^{\infty} \frac{x^{n}}{n} \ dx. [/tex]
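For the first example, a quick numerical sanity check (not a proof) can be done with the hypothetical choice [itex]f(x) = 1 - x[/itex], so that [itex]f(x)/(1-x) = 1[/itex] and the left-hand integral is exactly 1. Term-by-term integration gives a telescoping series that should approach the same value:

```python
# Sanity check for ∫_0^1 f(x)/(1-x) dx with the (arbitrary) choice f(x) = 1 - x.
# Direct value:   ∫_0^1 (1-x)/(1-x) dx = ∫_0^1 1 dx = 1.
# Term by term:   Σ_n ∫_0^1 (1-x) x^n dx = Σ_n [1/(n+1) - 1/(n+2)], which telescopes.

def term(n):
    # ∫_0^1 (1 - x) x^n dx = 1/(n+1) - 1/(n+2)
    return 1.0 / (n + 1) - 1.0 / (n + 2)

N = 100_000
partial_sum = sum(term(n) for n in range(N))  # equals 1 - 1/(N+1) by telescoping
direct = 1.0

print(partial_sum, direct)
```

The partial sums converge to the direct value, even though the geometric series itself diverges at [itex]x=1[/itex]; of course, agreement for one particular [itex]f[/itex] doesn't settle the general question.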

## Homework Equations

## The Attempt at a Solution

I want to say that it will always be true, since the series only fails to converge at a single point, and a single point has measure zero. But I don't know if that's sufficient justification.
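The "single point" intuition can be probed numerically on the second example with the hypothetical choice [itex]f(x) = 1[/itex]: then [itex]\int_0^1 \ln(1-x)\,dx = -1[/itex] exactly, while term-by-term integration gives [itex]-\sum_{n \geq 1} 1/(n(n+1))[/itex], another telescoping sum:

```python
# Probe of ∫_0^1 f(x) ln(1-x) dx with the (arbitrary) choice f(x) = 1.
# Direct value:   ∫_0^1 ln(1-x) dx = -1 exactly.
# Term by term:   -Σ_{n>=1} ∫_0^1 x^n / n dx = -Σ_{n>=1} 1/(n(n+1)),
# which telescopes to -(1 - 1/(N+1)) after N terms.

N = 100_000
termwise = -sum(1.0 / (n * (n + 1)) for n in range(1, N + 1))
direct = -1.0

print(termwise, direct)
```

Here the integrand is unbounded near [itex]x=1[/itex] and the series for [itex]\ln(1-x)[/itex] diverges there, yet the two sides still agree, which is consistent with the guess but, again, not a proof.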
