Two problems that have been giving me trouble

  • Thread starter: end3r7

Homework Help Overview

The discussion revolves around testing series for uniform convergence, focusing on two series with alternating terms. The first is \(\sum\limits_{n = 1}^{\infty} \frac{(-1)^n}{n^{x} \ln(x)}\), which involves a logarithmic factor; the second has terms \(f(n,x) = (-1)^n (1-x^{2})x^{n}\), with multiple convergence tests proposed for the interval [0,1].

Discussion Character

  • Mixed

Approaches and Questions Raised

  • Participants explore various methods for testing convergence, including the p-series test and the alternating series test. Some question the validity of their arguments regarding the behavior of the series near specific points, such as \(x = 1\). Others express uncertainty about the implications of fixing \(x\) versus considering it across the interval.

Discussion Status

There is an ongoing exploration of the convergence properties of the series, with participants offering different perspectives and questioning the assumptions made in their arguments. Some guidance has been provided, particularly regarding the application of convergence tests, but no consensus has been reached on the validity of specific approaches.

Contextual Notes

Participants note the importance of considering the behavior of the series at the endpoints of the interval [0,1], particularly the implications of logarithmic terms and the continuity of functions involved. There is also mention of the need for clarity in the problem statement regarding the domain of \(x\).

end3r7

Homework Statement


1) Test the following series for uniform convergence:
\(\sum\limits_{n = 1}^{\infty} \frac{(-1)^n}{n^{x}\ln(x)}\)

2) Let \(f(n,x) = (-1)^n (1-x^{2})x^{n}\) and consider \(\sum\limits_{n = 1}^{\infty} f(n,x)\).
a) Test for absolute convergence on [0,1]
b) Test for uniform convergence on [0,1]
c) Is \(\sum\limits_{n = 1}^{\infty} |f(n,x)|\) absolutely convergent on [0,1]?

Homework Equations


The Attempt at a Solution



For the first, I'm utterly lost. Is there an easy way to deal with such series?

For the second, could I just argue that for every \(0 \le x < 1\) there exists \(a\) such that \(x < a < 1\),
and thus
\(\left|\sum\limits_{n = 1}^{\infty} f(n,x)\right| \le \sum\limits_{n = 1}^{\infty} |f(n,x)| < \sum\limits_{n = 1}^{\infty} a^n = \frac{a}{1-a}\)
(since \(|f(n,x)| = (1-x^{2})x^{n} \le x^{n} < a^{n}\)),
and for x = 1 and any a with 0 < a < 1
\(\left|\sum\limits_{n = 1}^{\infty} f(n,x)\right| \le \sum\limits_{n = 1}^{\infty} |f(n,x)| = 0 < \sum\limits_{n = 1}^{\infty} a^n = \frac{a}{1-a}\)

This would prove all 3, right? But can I argue that way? Can I fix my 'x' ahead of time, or does my argument have to work for all x simultaneously? Because if it does, then all I would have to do is choose x between a and 1 and the argument would break down.
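The geometric-series bound above is easy to sanity-check numerically for a fixed x. A minimal sketch (my own check, not from the thread, using the term \((-1)^n(1-x^2)x^n\) as stated):

```python
def abs_term(n, x):
    # |(-1)^n (1 - x^2) x^n| = (1 - x^2) x^n
    return (1 - x**2) * x**n

def abs_partial(x, N=20_000):
    # partial sum of the absolute values, sum_{n=1}^{N}
    return sum(abs_term(n, x) for n in range(1, N + 1))

x = 0.9
a = 0.95                   # any a with x < a < 1
geom_bound = a / (1 - a)   # sum_{n>=1} a^n for 0 < a < 1
s = abs_partial(x)         # closed form: (1 - x^2) x / (1 - x) = x(1 + x) = 1.71 here
print(s, geom_bound)       # 1.71 < 19.0
```

The bound holds for this fixed x; the question of whether x may be fixed ahead of time is exactly the uniform-vs-pointwise issue raised above.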
 
I think I figured out number two (will post my work later).

But for 1), could I argue that the sequence of partial sums cannot be uniformly Cauchy, since it is unbounded near x = 1?
 
I confess I might have jumped the gun in saying that I figured out number two. Any help on either would still be greatly appreciated (also, if I could get a mod to give this thread a more descriptive title... I don't think I can change the title).
 
For the first one, how about splitting the sum into two: one for n even, one for n odd. Then you can use the p-series test on each.

The argument you use when 0 <= x < 1 works. When x = 1, what's inside the sum is 0 so the whole sum is 0. I find it unnecessary to consider the absolute value of the sum.
 
Call me stupid (I'd rather you don't though =P), but I'm not sure I understand your approach for the first.

What I did was show that I can take x arbitrarily close to 1, and since \(\ln(x)\) is continuous, I can make \(\ln(x)\) arbitrarily close to 0. Basically, I showed that for any 'n', I can make \(|\ln(x)| < \frac{1}{ne}\)
so for x sufficiently close to 1

\(\left|\frac{1}{n^{x}\ln(x)}\right| > \frac{1}{n\,|\ln(x)|} > \frac{ne}{n} = e\)

where the first inequality holds because \(n^x < n\) when x is between 0 and 1.

Therefore the terms of the series do not go to zero uniformly on its domain, so the series cannot converge uniformly.

Is that a valid argument?
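As a numerical illustration of this argument (my own sketch, not from the thread): for each n, choosing x so that \(|\ln(x)| = \frac{1}{2en}\) puts x close to 1 while keeping the n-th term above e.

```python
import math

def abs_term(n, x):
    # |(-1)^n / (n^x ln x)| for 0 < x < 1
    return 1.0 / (n**x * abs(math.log(x)))

for n in (10, 100, 1000, 10_000):
    x = math.exp(-1.0 / (2 * math.e * n))  # makes |ln x| = 1/(2en); x -> 1 as n grows
    print(n, x, abs_term(n, x))            # the term stays above e = 2.718...
```

So no matter how large n is, some x in (0, 1) makes the n-th term large, which is the obstruction to uniform convergence.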
 
What I wrote is for testing convergence, not uniform convergence. Sorry about that. I'm not familiar with uniform convergence.
 
e(ho0n3 said:
What I wrote is for testing convergence, not uniform convergence. Sorry about that. I'm not familiar with uniform convergence.

No problemo. =)

(Btw, uniform convergence is essentially the same, but it has to work for arbitrary x in the interval).
 
If x is a fixed constant greater than 0, then by the alternating series test the sum in 1) converges. Does this mean that the sum converges uniformly on the interval \((0, \infty)\)?
 
Oops, I forgot to say that x belongs to [0,1].

Note that \(\ln(1) = 0\), so the terms are undefined at x = 1 and the series can't converge there.

To converge uniformly, it has to converge for every x in the interval.
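For contrast, the series in 2) does behave uniformly on [0,1]: summing the geometric series gives the closed form \((1-x^2)\sum_{n\ge 1}(-x)^n = -x(1-x)\), and the sup-norm error of the partial sums shrinks as N grows. A minimal numerical sketch (my own check, not from the thread):

```python
def partial(N, x):
    # partial sum S_N(x) of sum_{n=1}^{inf} (-1)^n (1 - x^2) x^n
    return sum((-1)**n * (1 - x**2) * x**n for n in range(1, N + 1))

def limit(x):
    # closed form of the full sum on [0, 1]: (1 - x^2) * (-x / (1 + x)) = -x(1 - x)
    return -x * (1 - x)

xs = [i / 500 for i in range(501)]  # grid on [0, 1]

def sup_err(N):
    # approximate sup-norm distance between S_N and the limit over [0, 1]
    return max(abs(partial(N, x) - limit(x)) for x in xs)

print([sup_err(N) for N in (10, 50, 200)])  # decreasing toward 0
```

The alternating-series remainder bound \(|S_N(x) - f(x)| \le (1-x^2)x^{N+1}\), maximized over [0,1], goes to 0 with N, which is what the shrinking sup-norm error reflects.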
 
Please, if a mod could, edit the first one so x belongs to the closed interval [0,1].

Thanks =)
 
