Continuity and Differentiability of Infinite Series

AnalysisNewb

Homework Statement



I came across a problem where f: (-π/2, π/2) → ℝ is defined by ##f(x) = \sum_{n=1}^\infty \frac{\sin^n(x)}{\sqrt{n}}##

The problem had three parts.

The first was to prove the series was convergent ∀ x ∈ (-π/2, π/2).

The second was to prove that the function f(x) was continuous over the same interval.

The third was to prove that the function f(x) was differentiable over the interval.

Homework Equations

The Attempt at a Solution



The first part of the problem was simple using the comparison test with ##\sum_{n=1}^\infty |\sin(x)|^n##, and noting that this geometric series converges as long as |sin(x)| < 1. Since |sin(x)| = 1 only at x = ±π/2, which are excluded from the open interval, the series converges for every x in (-π/2, π/2).
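
Written out (unless I'm overlooking something), the comparison is just the standard comparison test plus the geometric series formula: for each fixed x in (-π/2, π/2),

##\left|\frac{\sin^n(x)}{\sqrt{n}}\right| \le |\sin(x)|^n \quad\text{and}\quad \sum_{n=1}^{\infty} |\sin(x)|^n = \frac{|\sin(x)|}{1-|\sin(x)|} < \infty,##

since |sin(x)| < 1, so the series converges absolutely at every such x.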

On the second part of the problem, I ran into difficulty. I tried to show that f(x) is the limit of a sequence of continuous functions (the partial sums) in order to prove the continuity of f(x), but then I remembered that for this to work, the convergence has to be uniform and not merely pointwise.

So I tried to prove the series was uniformly convergent using the Weierstrass M-test, but failed because I couldn't find a convergent series of terms ##M_n## that were always greater than the absolute values of the terms of the series.

This same issue with convergent vs. uniformly convergent is what is plaguing me with the third part as well.

Any advice on how to work around this convergent vs. uniformly convergent issue would be greatly appreciated. If I can prove that the series is uniformly convergent, the second and third parts are quite easy: the Uniform Limit Theorem handles part 2, and the related theorems about differentiability of series that converge uniformly on an interval handle part 3.
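
(For reference, the statements I'm relying on, which may be worded a little differently in other books: the Uniform Limit Theorem says that a uniform limit of continuous functions is continuous; and the differentiation theorem says that if each ##f_n## is differentiable on ##[a,b]##, ##\sum f_n(x_0)## converges for some ##x_0 \in [a,b]##, and ##\sum f_n'## converges uniformly on ##[a,b]##, then ##\sum f_n## converges uniformly on ##[a,b]## to a differentiable function with ##\left(\sum f_n\right)' = \sum f_n'##.)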
 
Is there any reason this function would not be uniformly convergent? The convergence properties are based on n and not x.
I think the wording is that for any epsilon>0, there exists an n such that |f_n(x) - f(x) | < epsilon for all x.
Maybe you need to add a caveat in there, like for ##-\pi/2+\delta < x < \pi/2-\delta##, but as you said, it should work out.
If that doesn't work, you could take a Taylor expansion of sin(x) around x_0 and see if you can find that this relation is true:
##\left| \sum_{n=1}^{\infty} \frac{(\sin x_0)^n }{\sqrt{n}} - \sum_{n=1}^{\infty} \frac{(\sin (x_0+\delta ) )^n }{\sqrt{n}}\right| < \epsilon##
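
With that caveat, the M-test bound would look something like this (fix ##0<\delta<\pi/2##; double-check the constant): for all ##|x| \le \pi/2 - \delta##,

##\left|\frac{\sin^n(x)}{\sqrt{n}}\right| \le \frac{\left(\sin(\pi/2-\delta)\right)^n}{\sqrt{n}} =: M_n, \qquad \sum_{n=1}^\infty M_n < \infty \text{ since } \sin(\pi/2-\delta) < 1.##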
 
Yes, your wording for uniform convergence is similar to the one in my book, with the minor exception that it is: for any epsilon > 0, there exists an N such that for all n > N, etc.

I'm having some difficulty understanding how the convergence properties aren't based on x.

If I take ##|f_n(x)-f(x)| < \sum_{m=n+1}^\infty\frac{(\sin(x))^m}{\sqrt{m}} < \sum_{m=n+1}^\infty(\sin(x))^m = \frac{\sin^{n+1}(x)}{1-\sin(x)}##.

While ##\frac{\sin^{n+1}(x)}{1-\sin(x)}## can be made less than epsilon for any fixed x, doesn't the N required to do it depend on x?

I can't think of a different way to do this, especially since ##\sum_{n=1}^\infty\frac{1}{\sqrt{n}}## is divergent...
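
To make that concrete (assuming I've computed the supremum correctly): ##\sup_{|x|<\pi/2}\left|\frac{\sin^n(x)}{\sqrt{n}}\right| = \frac{1}{\sqrt{n}}##, so any ##M_n## that works on the whole open interval would have to satisfy ##M_n \ge \frac{1}{\sqrt{n}}##, and then ##\sum M_n## diverges. So the M-test can't succeed on all of (-π/2, π/2).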
 
You're right, I missed that part: as x goes to π/2, the N you need goes to infinity.
Then I would recommend the epsilon delta style proof with the Taylor expansion.
 
RUber said:
Then I would recommend the epsilon delta style proof with the Taylor expansion.

Trying this, it seems like the Taylor expansion doesn't help the convergence, because the terms of the resulting series are either the unchanged sine series (when they are even) or slightly increased in amplitude when the exponent is a product of two of 3, 5, 7, 9, etc.

For example
##f(x) = \left(x - \frac{x^3}{3!} + \frac{x^5}{5!} + \cdots\right) + \frac{1}{\sqrt{2}}\left(x - \frac{x^3}{3!} + \frac{x^5}{5!} + \cdots\right)^2 + \frac{1}{\sqrt{3}}\left(x - \frac{x^3}{3!} + \frac{x^5}{5!} + \cdots\right)^3 + \cdots##

##= x + \frac{x^2}{\sqrt{2}} - x^3\left(\frac{1}{3!} + \frac{1}{\sqrt{3}}\right) + \frac{x^4}{\sqrt{4}} + \frac{x^5}{5!} + \cdots##

This series is less than ##\sum_{n=1}^\infty\frac{x^n}{\sqrt{n}}##.

I don't think this is convergent except for certain values of x, again ruling out uniform convergence. (Although I might just be frustrated with all of the terms)

Could it be possible that the initial ##f(x) = \sum_{n=1}^\infty\frac{\sin^n(x)}{\sqrt{n}}## is not uniformly convergent on the interval?
 
I was thinking of something along the lines of ##\sin(x_0+\delta)= \sum_{i=0}^\infty \frac{\delta^i}{i!}\frac{d^i}{dx^i}\sin(x_0)##
Not for convergence, but to get straight at continuity.
That should give you a problem that looks like
##\sum_{n=1}^\infty\sum_{i=1}^\infty \frac{\delta^i}{i!\sqrt{n}}<\epsilon##
That too seems problematic. Perhaps you need to first fix x = x_0. Then x_0 is at least some delta from π/2 or -π/2, due to the open interval. For any individual x_0, you should be able to show continuity, and therefore you can for all x.
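
Roughly (check the details against the theorems in your book): fix ##x_0## and choose ##\eta>0## with ##[x_0-\eta,\,x_0+\eta]\subset(-\pi/2,\pi/2)##. On that closed interval ##q:=\max|\sin(x)|<1##, so ##\left|\frac{\sin^n(x)}{\sqrt{n}}\right|\le\frac{q^n}{\sqrt{n}}=:M_n## with ##\sum M_n<\infty##, and the M-test gives uniform convergence there, hence continuity at ##x_0##. The differentiated series is bounded the same way: ##\left|\sqrt{n}\,\sin^{n-1}(x)\cos(x)\right|\le\sqrt{n}\,q^{n-1}## and ##\sum\sqrt{n}\,q^{n-1}<\infty##, which should give differentiability at ##x_0## too. Since ##x_0## was arbitrary, that covers the whole open interval.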
 