Limit of Power Series at x=1 Proof

JG89

Homework Statement



If a_v > 0 and \sum_{v=0}^{\infty} a_v converges, then prove that \lim_{x \rightarrow 1^-} \sum_{v=0}^{\infty} a_v x^v = \sum_{v=0}^{\infty} a_v.

Homework Equations


The Attempt at a Solution



Since \sum a_v converges, we can say that \sum a_v x^v converges for x = 1. This implies that it converges uniformly for all |x| < 1.

Let f(x) = \sum a_v x^v for |x| < 1. Since the convergence is uniform, we know that f is continuous in its interval of convergence. Since the power series converges as well for x = 1, we can also say that f(x) is "left-hand continuous" at x = 1. That is, for all positive epsilon there exists a positive delta such that |f(x) - f(1)| < \epsilon whenever |x - 1| < \delta, if we only take x approaching 1 from the left.
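To spell out why uniform convergence gives continuity here (this is just the standard \epsilon/3 argument, sketched with the partial sums f_n(x) = \sum_{v=0}^{n} a_v x^v): for x, y in the interval,

|f(x) - f(y)| \leq |f(x) - f_n(x)| + |f_n(x) - f_n(y)| + |f_n(y) - f(y)|,

where the first and third terms are < \epsilon/3 for n large enough (uniformly in x and y, by uniform convergence), and the middle term is < \epsilon/3 for y close enough to x because the polynomial f_n is continuous.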

Note that |f(x) - f(1)| = |\sum a_v x^v - \sum a_v| < \epsilon whenever |x-1| < \delta.

This is precisely the statement: \lim_{x \rightarrow 1^-} \sum_{v=0}^{\infty} a_v x^v = \sum_{v=0}^{\infty} a_v. QED.

The proof seems bullet-proof to me, but what bugs me is that I didn't use the fact that the a_v are positive, so there must be something wrong with the proof...
 
"This implies that it converges uniformly for all |x| < 1."

What theorem are you using to justify this (i.e., what is its exact statement)?

I don't see how this problem can be solved without quoting Abel's Theorem, or essentially reproducing its proof. (Also, by Abel's Thm, a_v>0 is not required.)
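For reference, the statement I mean (the standard form of Abel's limit theorem): if \sum_{v=0}^{\infty} a_v converges to s, then f(x) = \sum_{v=0}^{\infty} a_v x^v converges for |x| < 1 and \lim_{x \rightarrow 1^-} f(x) = s. The usual proof goes through summation by parts: writing s_n = \sum_{v=0}^{n} a_v,

\sum_{v=0}^{n} a_v x^v = (1-x) \sum_{v=0}^{n-1} s_v x^v + s_n x^n,

and letting n \rightarrow \infty for fixed |x| < 1 gives f(x) = (1-x) \sum_{v=0}^{\infty} s_v x^v, hence f(x) - s = (1-x) \sum_{v=0}^{\infty} (s_v - s) x^v, which can be made small near x = 1 by splitting the sum where s_v is close to s.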
 
Yeah, I realized the mistake I made. I have a theorem in my book giving uniform convergence on every closed interval |x| <= N with N < 1, but that doesn't let me conclude anything about continuity at x = 1; for that I would have to use Abel's theorem. But I found another way to solve the problem. Thanks.

EDIT: The exact statement of the theorem I'm using: "If a power series in x converges for a value x = c, it converges absolutely for every value x such that |x| < |c|, and the convergence is uniform in every interval |x| <= N, where N is any positive number less than |c|. Here N may lie as near to |c| as we please."
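For context, here is how I understand why that theorem only reaches up to N < |c| (a sketch of the usual M-test argument, not necessarily the book's proof): since \sum a_v c^v converges, its terms are bounded, say |a_v c^v| \leq M. Then for |x| \leq N < |c|,

|a_v x^v| \leq |a_v c^v| \left( \frac{N}{|c|} \right)^v \leq M \left( \frac{N}{|c|} \right)^v,

and since N/|c| < 1 the right-hand side is a convergent geometric series, so the Weierstrass M-test gives absolute and uniform convergence on |x| \leq N. The bound deteriorates as N \rightarrow |c|, which is why the theorem says nothing about uniform convergence on all of |x| < |c|.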

Just a quick question: do we ever talk about uniform convergence on open intervals? Or must we mark off some endpoints so that we can work on a closed interval?
 
An example of uniform convergence on an open interval (in fact on all of \mathbb{R}) comes from \sin(x): the functions f_n(x) = \sin(x + 1/n) converge uniformly to \sin(x). This can be easily seen once you know the inequality |\sin x - \sin y| \leq |x - y| for all x,y \in \mathbb{R}.
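Spelling that estimate out (with f_n as above): |f_n(x) - \sin(x)| = |\sin(x + 1/n) - \sin(x)| \leq 1/n for every x \in \mathbb{R}, so \sup_{x \in \mathbb{R}} |f_n(x) - \sin(x)| \leq 1/n \rightarrow 0. That is uniform convergence on the open (indeed unbounded) interval (-\infty, \infty), so there is no need to restrict to closed intervals for the definition to make sense.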
 