Does the Convergence of \(\Sigma \frac{f(x)}{x}\) Depend on \(\Sigma f(x) = 0\)?


Homework Help Overview

The discussion revolves around the convergence of the series \(\Sigma \frac{f(x)}{x}\) in relation to the periodic function \(f\) and the condition \(\Sigma f(x) = 0\). Participants are exploring the implications of these series and the conditions under which convergence occurs.

Discussion Character

  • Conceptual clarification, Mathematical reasoning, Problem interpretation, Assumption checking

Approaches and Questions Raised

  • Participants discuss using contradiction to prove the relationship between the two series. There are attempts to clarify the implications of convergence and the conditions under which it holds. Some participants question the validity of certain assumptions regarding absolute convergence and the manipulation of series.

Discussion Status

The discussion is active, with participants providing insights and raising questions about the reasoning presented. Some guidance has been offered regarding the careful handling of convergence arguments, and there is acknowledgment of the complexity involved in the proof. Multiple interpretations of the problem are being explored.

Contextual Notes

Participants note the periodic nature of the function \(f\) and the constraints imposed by the series definitions. There is an emphasis on the need to avoid assuming absolute convergence in their arguments.

CalTech>MIT

Homework Statement


Let \(f:\mathbb{Z}\rightarrow\mathbb{R}\) be periodic, such that \(f(x+a) = f(x)\) for some fixed integer \(a \geq 1\).

Prove that \(\sum_{x=1}^{\infty} \frac{f(x)}{x}\) converges if and only if \(\sum_{x=1}^{a} f(x) = 0\).


Homework Equations


n/a


The Attempt at a Solution


Ok, so I have a general idea of how to write the proof. We can do this by contradiction and assume that the second sum isn't equal to zero. As a result, the first series behaves like a (multiple of the) harmonic series, and thus doesn't converge? Is that the right idea?
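As a rough numerical check of that intuition (just a sketch, not part of the proof; the particular function values below are arbitrary choices with period \(a = 3\)):

```python
# Partial sums of sum f(x)/x for two periodic f with period a = 3.
# f_zero has period-sum 0; f_nonzero has period-sum 3 (values are arbitrary examples).

def partial_sum(values, n):
    """Partial sum of sum_{x=1}^{n} f(x)/x, where f has the given period-a values."""
    a = len(values)
    return sum(values[(x - 1) % a] / x for x in range(1, n + 1))

f_zero = [1.0, 1.0, -2.0]     # 1 + 1 - 2 = 0
f_nonzero = [1.0, 1.0, 1.0]   # 1 + 1 + 1 = 3 != 0

for n in (10**3, 10**4, 10**5):
    print(n, partial_sum(f_zero, n), partial_sum(f_nonzero, n))
```

The `f_zero` partial sums stabilize as \(n\) grows, while the `f_nonzero` partial sums keep growing like \(\log n\), mirroring the harmonic series.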
 
First show that the converse holds.

Then, by contradiction, suppose that \(\sum_{x=1}^{a-1} f(x) = -f(a) - b\) with \(b \neq 0\). Put \(g(a) = f(a) + b\) and \(g(x) = f(x)\) for the other values (while still making sure the function is periodic), so that \(\sum_{x=1}^{a} g(x) = 0\). Then we get (since both series converge) that

\(\sum_{x=1}^{\infty} \frac{f(x)}{x} - \sum_{x=1}^{\infty} \frac{g(x)}{x}\)

converges. But this difference diverges, since it looks very much like a harmonic series.
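Spelling out that last step (a sketch): \(f\) and \(g\) differ only at the multiples of \(a\), where \(f(ka) - g(ka) = -b\), so

```latex
\sum_{x=1}^{\infty}\frac{f(x)-g(x)}{x}
  = \sum_{k=1}^{\infty}\frac{f(ka)-g(ka)}{ka}
  = -\frac{b}{a}\sum_{k=1}^{\infty}\frac{1}{k},
```

which is a nonzero multiple of the harmonic series whenever \(b \neq 0\), hence divergent, contradicting the assumed convergence of both series.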
 
micromass said:
First show that the converse holds.

Then, by contradiction, suppose that \(\sum_{x=1}^{a-1} f(x) = -f(a) - b\) with \(b \neq 0\). Put \(g(a) = f(a) + b\) and \(g(x) = f(x)\) for the other values (while still making sure the function is periodic), so that \(\sum_{x=1}^{a} g(x) = 0\). Then we get (since both series converge) that

\(\sum_{x=1}^{\infty} \frac{f(x)}{x} - \sum_{x=1}^{\infty} \frac{g(x)}{x}\)

converges. But this difference diverges, since it looks very much like a harmonic series.

It's been a while since I took analysis, but I would be careful with this approach. It seems to me that you are trying to use two ideas here.

First, there is the idea that the sum of the limits is the limit of the sums. Since we do not have absolute convergence here, we are not free to pair up terms however we want, even when we know that both series converge. That is, although the sum of the limits does equal the limit of the sums in this case, what converges is the difference of the partial sums; we cannot freely regroup or recombine individual terms of the two series.

Second, you are trying to conclude that the second series converges by virtue of the convergence of the first (as I read it), but you have changed the sign of an entire subsequence of the sequence of terms, so the convergence of the first series no longer gives you convergence of the second (at least I do not think so, not without absolute convergence).

Have you guys tried induction on a? I am not 100% positive that it works (again because you are not guaranteed absolute convergence), but it might be worth a shot.
 
I know that the series is not absolutely convergent, and I've been very careful not to use absolute convergence...

Secondly, the convergence of \(\sum \frac{g(x)}{x}\) does not follow from the convergence of the first series. It follows from the reverse implication of what he's trying to prove (or alternatively from Dirichlet's criterion).
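For completeness, the Dirichlet's-criterion direction works roughly as follows (a sketch): since \(\sum_{x=1}^{a} g(x) = 0\), the partial sums of \(g\) repeat with period \(a\),

```latex
S_N := \sum_{x=1}^{N} g(x), \qquad S_{N+a} = S_N
\;\Longrightarrow\; |S_N| \le \max_{1 \le m \le a} |S_m| < \infty,
```

so the partial sums are bounded. Since \(1/x\) decreases monotonically to \(0\), Dirichlet's test then gives convergence of \(\sum_{x=1}^{\infty} g(x)/x\).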
 
Ok, well, I wrote up a reply and then PF timed me out, so that reply went bye-bye. In a nutshell:

1) I apologize if I misunderstood your argument, in hindsight I had no good reason to assume that you were using absolute convergence.

2) I am convinced that if argued carefully, your approach works just fine. I had suspected that it did before (except for point 4 below); my point was not to argue that it didn't, but simply that this approach needs to be handled carefully.

3) "It follows from the reverse implication of what he's trying to prove" - agreed
"or alternatively from Dirichlet's criterion" - Nice!

4) "but in this case you have changed the sign of an entire subsequence of the sequence of terms" - this was wrong on my part. I had misread your construction.
 
