# Convergence of a general series

1. Jul 5, 2011

### ELog

This problem has been bothering me for some time. Any thoughts or insights are greatly appreciated.

Consider a function, f, with continuous third derivative on [-1,1]. Prove that the series
$\sum^{\infty}_{n=1} (nf(\frac{1}{n})-nf(-\frac{1}{n}) - 2\frac{df}{dn}(0))$ converges.

Thanks in advance for any help!

2. Jul 6, 2011

### Petr Mugver

I think the statement is false. Try with f(x) = x.

3. Jul 6, 2011

### HallsofIvy

Staff Emeritus
You also cannot have "$df/dn$" since f is not a function of n.

Perhaps you meant
$$\sum_{n=1}^\infty\left(nf(1/n)+ nf(-1/n)- 2\frac{df}{dx}(0)\right)$$

4. Jul 6, 2011

### I like Serena

Uhh, the statement holds true for f(x) = x (assuming df/dn is actually df/dx, as HoI suggested).
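To check this concretely: for f(x) = x (so f'(0) = 1), each term of the original series is

$$n f\left(\tfrac{1}{n}\right) - n f\left(-\tfrac{1}{n}\right) - 2f'(0) = n \cdot \tfrac{1}{n} - n \cdot \left(-\tfrac{1}{n}\right) - 2 \cdot 1 = 1 + 1 - 2 = 0,$$

so every term vanishes and the series trivially converges.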

@HoI: You flipped a sign (typo, I presume?). With your current expression, the claim fails for f(x) = x.

5. Jul 6, 2011

### micromass

Staff Emeritus
As for a proof, you might want to take the Taylor expansion of f at 0. A lot of terms will drop and you'll see fun things happening.
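A sketch of the cancellation (expanding f about 0 up to the second-degree term, with Lagrange remainder points $\xi_\pm \in (-\tfrac{1}{n}, \tfrac{1}{n})$):

$$f\left(\pm\tfrac{1}{n}\right) = f(0) \pm \frac{f'(0)}{n} + \frac{f''(0)}{2n^2} \pm \frac{f'''(\xi_\pm)}{6n^3},$$

so the $f(0)$ and $f''(0)$ terms cancel in the difference and

$$n f\left(\tfrac{1}{n}\right) - n f\left(-\tfrac{1}{n}\right) - 2f'(0) = \frac{f'''(\xi_+) + f'''(\xi_-)}{6n^2}.$$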

6. Jul 6, 2011

### ELog

I found the problem in the form I presented, though the derivative was with respect to x (as HoI pointed out). If I expand f as a Taylor series, I think I see the fun of which you speak.

EDIT: All we know about f is that its third derivative is continuous. Does this affect the Taylor series (is it invalid to use derivatives at 0 of order > 3 if we are unsure whether they even exist)?

Last edited: Jul 6, 2011
7. Jul 6, 2011

### I like Serena

You can't use the full Taylor series expansion (you don't even know that it converges).

However, you can apply Taylor's theorem with only finitely many terms: expand up to the first- or second-degree term, and use the Lagrange remainder (which involves the second or third derivative) to bound the result.
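Spelled out: expand f to the second-degree Taylor polynomial about 0 with Lagrange remainder, so that for intermediate points $\xi_\pm \in (-\tfrac{1}{n}, \tfrac{1}{n})$ the even-order terms cancel in the difference, leaving

$$\left| n f\left(\tfrac{1}{n}\right) - n f\left(-\tfrac{1}{n}\right) - 2f'(0) \right| = \left| \frac{f'''(\xi_+) + f'''(\xi_-)}{6n^2} \right| \le \frac{M}{3n^2},$$

where $M = \max_{[-1,1]} |f'''|$ exists because $f'''$ is continuous on the compact interval $[-1,1]$. Since $\sum 1/n^2$ converges, the series converges absolutely by comparison.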