Does the Series Involving a Function with Continuous Third Derivative Converge?

ELog
This problem has been bothering me for some time. Any thoughts or insights are greatly appreciated.

Consider a function f with continuous third derivative on [-1,1]. Prove that the series
\sum_{n=1}^{\infty}\left(nf\left(\frac{1}{n}\right)-nf\left(-\frac{1}{n}\right)-2\frac{df}{dn}(0)\right)
converges.

Thanks in advance for any help!
 
I think the statement is false. Try with f(x) = x.
 
You also cannot have "df/dn" since f is not a function of n.

Perhaps you meant
\sum_{n=1}^{\infty}\left(nf\left(\frac{1}{n}\right)+nf\left(-\frac{1}{n}\right)-2\frac{df}{dx}(0)\right)
 
Uhh, the statement holds true for f(x)=x (assuming df/dn is actually df/dx as HoI suggested). :rolleyes:

@HoI: You flipped a sign (typo, I presume?). With your current expression, the series does not converge for f(x)=x.
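To spell out the check for f(x)=x: with the original minus sign, every term is
n f\left(\tfrac{1}{n}\right) - n f\left(-\tfrac{1}{n}\right) - 2f'(0) = 1-(-1)-2 = 0,
so the series converges trivially. With the plus sign, every term is 1+(-1)-2 = -2, and the series diverges.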
 
As for a proof, you might want to take the Taylor expansion of f at 0. A lot of terms will drop and you'll see fun things happening.
 
I found the problem in the form I presented, though the derivative was with respect to x (as HoI pointed out). If I expand f as a Taylor series, I think I see the fun of which you speak.

EDIT: All we know about f is that its third derivative is continuous. Does this affect the Taylor series? (Is it invalid to use derivatives at 0 of order > 3 if we are unsure whether they even exist?)
 
You can't use the full Taylor series expansion (you don't even know that it converges).

However, you can use Taylor's theorem with a remainder: expand up to four terms, letting the last term (which involves the second or third derivative at an intermediate point) serve as a remainder that bounds the error.
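Here is a sketch of that argument, using the third-order Lagrange remainder. By Taylor's theorem there exist \xi_n \in (0, \tfrac{1}{n}) and \eta_n \in (-\tfrac{1}{n}, 0) such that
f\left(\tfrac{1}{n}\right) = f(0) + \frac{f'(0)}{n} + \frac{f''(0)}{2n^2} + \frac{f'''(\xi_n)}{6n^3}, \qquad f\left(-\tfrac{1}{n}\right) = f(0) - \frac{f'(0)}{n} + \frac{f''(0)}{2n^2} - \frac{f'''(\eta_n)}{6n^3}.
Subtracting, multiplying by n, and noting that the f(0) and f''(0) terms cancel,
nf\left(\tfrac{1}{n}\right) - nf\left(-\tfrac{1}{n}\right) - 2f'(0) = \frac{f'''(\xi_n) + f'''(\eta_n)}{6n^2}.
Since f''' is continuous on the compact interval [-1,1], it is bounded there, say |f'''| \le M, so each term of the series is at most \frac{M}{3n^2} in absolute value, and the series converges absolutely by comparison with \sum 1/n^2.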
 