Understanding the Limit of 1/x as x Approaches 0: Uncovering the Error

coldboyqn
I am doing something that I'm sure is wrong, but I cannot find the error. See below:
\frac{1}{x}=x\times\frac{1}{x^2} \qquad (1)

The Taylor series of \frac{1}{x^2} about x=\alpha:
\frac{1}{x^2}=\frac{1}{\alpha^2}+\sum_{k=1}^\infty g(k)(x-\alpha)^k

in which, for k from 1 to infinity,
g(k)=(-1)^k\times\frac{(k+1)!}{\alpha^{k+2}\,k!}=\frac{(-1)^k\,(k+1)}{\alpha^{k+2}}

Substituting the Taylor series of \frac{1}{x^2} into (1), we obtain:
\frac{1}{x}=\frac{x}{\alpha^2}+\sum_{k=1}^\infty x\,g(k)(x-\alpha)^k
So: \lim_{x\rightarrow 0} \frac{1}{x}=\lim_{x\rightarrow 0}\left(\frac{x}{\alpha^2}+\sum_{k=1}^\infty x\,g(k)(x-\alpha)^k\right)=0 \quad (??!?)
Can anyone show me where the error is, please?
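A quick check of the coefficients (a sketch, assuming the expansion is taken about x=\alpha with \alpha>0): the k-th derivative of f(x)=\frac{1}{x^2} is
\frac{d^k}{dx^k}\frac{1}{x^2}=\frac{(-1)^k\,(k+1)!}{x^{k+2}},
so the k-th Taylor coefficient is
\frac{f^{(k)}(\alpha)}{k!}=\frac{(-1)^k\,(k+1)!}{\alpha^{k+2}\,k!}=\frac{(-1)^k\,(k+1)}{\alpha^{k+2}}=g(k),
and the k=0 term is f(\alpha)=\frac{1}{\alpha^2}, the constant term above.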
 
You need to check the radius of convergence of the Taylor series. Note that if you don't multiply by x, you get the limit as x\rightarrow 0 of \frac{1}{x^2} to be \frac{1}{\alpha^2}, where \alpha is arbitrary.
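For concreteness, a ratio-test sketch of where the series converges (assuming \alpha>0):
\lim_{k\rightarrow\infty}\left|\frac{g(k+1)(x-\alpha)^{k+1}}{g(k)(x-\alpha)^k}\right|=\lim_{k\rightarrow\infty}\frac{k+2}{(k+1)\,\alpha}\,|x-\alpha|=\frac{|x-\alpha|}{\alpha},
so the series converges only when |x-\alpha|<\alpha, i.e. for 0<x<2\alpha. In particular, x=0 lies outside the interval of convergence.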
 
I see that if I use the Taylor series above to determine the value of \lim_{x\rightarrow 0}\frac{1}{x^2}, I obtain infinity. Is that right?
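Indeed, substituting x=0 directly into the series (a sketch, assuming \alpha>0) gives terms that do not shrink:
g(k)(0-\alpha)^k=\frac{(-1)^k\,(k+1)}{\alpha^{k+2}}\,(-\alpha)^k=\frac{k+1}{\alpha^2},
so the partial sums \frac{1}{\alpha^2}+\sum_{k=1}^{n}\frac{k+1}{\alpha^2} grow without bound and the series diverges to +\infty at x=0.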
 
You're right. Sorry, I should have read your question more carefully. The problem is that the last limit isn't 0. You can't just plug in 0 to get the limit, as the function isn't continuous at x=0. A more careful calculation shows that this limit diverges as well.
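To make this concrete, write S(x) for the sum of the multiplied series (a sketch, assuming \alpha>0). Every term of S carries a factor of x, so term-by-term substitution gives S(0)=0, but inside the region where the original series is valid,
S(x)=\frac{x}{\alpha^2}+\sum_{k=1}^\infty x\,g(k)(x-\alpha)^k=x\times\frac{1}{x^2}=\frac{1}{x}\qquad(0<x<2\alpha),
so \lim_{x\rightarrow 0^+}S(x)=+\infty\neq S(0)=0; the sum function is not continuous at x=0, which is why plugging in 0 gives the wrong answer.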
 
Wouldn't it be best to draw a graph, especially when it's possible?
 
StatusX said:
The problem is that the last limit isn't 0. You can't just plug in 0 to get the limit, as the function isn't continuous at x=0.
Certainly the last limit isn't 0. But my problem is the strange result I obtain when I treat \frac{1}{x} as x times the Taylor series of \frac{1}{x^2}.
I wonder where my error is when I calculate the limit by this method!
 
It isn't hard to check that the series converges precisely in (0,2\alpha), and in this region it converges to \frac{1}{x^2}. If you multiply it termwise by x, you get a series that converges in [0,2\alpha). But there's no reason to expect that series evaluated at x=0 to give the same thing as the limit of x\times\frac{1}{x^2} as x\rightarrow 0, since the Taylor series does not converge at x=0. You can prove the limit diverges explicitly, and the easiest way to do this is just to prove that the series does converge to \frac{1}{x} in (0,2\alpha).
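A short sketch of that last step, using the geometric series for \frac{1}{x} about \alpha (assuming \alpha>0): for |x-\alpha|<\alpha,
\frac{1}{x}=\frac{1}{\alpha+(x-\alpha)}=\frac{1}{\alpha}\sum_{k=0}^{\infty}\left(-\frac{x-\alpha}{\alpha}\right)^k=\sum_{k=0}^{\infty}\frac{(-1)^k}{\alpha^{k+1}}(x-\alpha)^k,
and differentiating termwise (which is valid inside the radius of convergence) and changing sign recovers the series for \frac{1}{x^2} with exactly the coefficients g(k). Multiplying back by x therefore gives a series equal to x\times\frac{1}{x^2}=\frac{1}{x} on (0,2\alpha), which clearly diverges as x\rightarrow 0^+.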
 
Oh, yes, I see. Since the Taylor series of \frac{1}{x^2} diverges at 0, I cannot simply multiply it by x to evaluate \frac{1}{x} at 0, right? And it is unreasonable to multiply an expression that diverges (to \infty) by a variable that goes to zero and conclude that the product goes to zero, right?
Thanks for the explanation; I understand now.
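As a closing illustration that a product of a factor tending to 0 and a factor tending to \infty can do anything:
x\times\frac{1}{x^2}=\frac{1}{x}\rightarrow+\infty,\qquad x\times\frac{1}{x}=1\rightarrow 1,\qquad x\times\frac{1}{\sqrt{x}}=\sqrt{x}\rightarrow 0\qquad\text{as }x\rightarrow 0^+,
so such a 0\times\infty product has to be evaluated directly rather than by arguing that the factor of x forces the limit to 0.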
 
