Let's say we have this series:
s(x) = a0 + a1(x-k) + a2(x-k)^2 + a3(x-k)^3
If I integrate the series term by term, a theorem in the books says that I will get the antiderivative S(x) + C, but won't C always be equal to zero?
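A quick numerical sketch can illustrate why C stays free: the term-by-term antiderivative S(x) is just one particular antiderivative, and adding any constant C gives another function whose derivative is still s(x). The coefficients below are made up for illustration.

```python
def s(x, a, k):
    """Evaluate the series sum of a[n] * (x-k)^n."""
    return sum(an * (x - k) ** n for n, an in enumerate(a))

def S(x, a, k, C=0.0):
    """Term-by-term antiderivative: sum of a[n]/(n+1) * (x-k)^(n+1), plus C."""
    return C + sum(an / (n + 1) * (x - k) ** (n + 1) for n, an in enumerate(a))

# Numerical check: d/dx S(x) matches s(x) for ANY choice of C,
# so nothing forces C to be zero -- it is a genuinely free constant.
a, k, x, h = [1.0, 2.0, 3.0, 4.0], 1.0, 2.5, 1e-6
for C in (0.0, 5.0, -3.0):
    deriv = (S(x + h, a, k, C) - S(x - h, a, k, C)) / (2 * h)
    assert abs(deriv - s(x, a, k)) < 1e-4
```

Every value of C passes the derivative check, which is exactly what the theorem means by "the antiderivative is S(x) + C".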