Does 1/x approach 0 as x approaches infinity?

  • Thread starter: Poopsilon
I'm curious if 1/x ~ 0. Technically by the definition I know it's not since lim x→∞ (1/x)/0 = ∞. But I feel like it does satisfy what the 'on the order of twiddles' is trying to measure. Thus I was wondering if maybe we specially define this to be true in the same way we might define 0! = 1.
 
OK, your post is making very little sense.

Poopsilon said:
I'm curious if 1/x ~ 0.

What is x? If x is 0.01, then 1/x is 100. I doubt you would say that is close to 0.

You need to specify what x does. If you say that 1/x ~ 0 for large x, that is non-rigorous but meant to be intuitive. You can make it rigorous by writing

$$\lim_{x\rightarrow +\infty}\frac{1}{x}=0$$
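That limit can also be spot-checked numerically (a minimal sketch; the tolerances and sample points below are arbitrary choices, not part of the definition):

```python
# Numerical illustration of lim_{x -> +oo} 1/x = 0: for any tolerance
# eps > 0, 1/x stays below eps once x exceeds the threshold N = 1/eps.
for eps in (1e-2, 1e-4, 1e-6):
    N = 1.0 / eps        # threshold from the epsilon-N definition
    x = 10 * N           # any sample point beyond the threshold
    assert 1.0 / x < eps
print("1/x eventually falls below every positive tolerance")
```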

Technically by the definition I know it's not since lim x→∞ (1/x)/0 = ∞.

Why do you divide by 0? You can never divide by 0.

But I feel like it does satisfy what the 'on the order of twiddles' is trying to measure. Thus I was wondering if maybe we specially define this to be true in the same way we might define 0! = 1.

What does any of this have to do with 0! = 1?
 
lol, ok, I'm assuming you know what the relation symbol ~ means. I'm just taking x to infinity and looking at the behavior of the function f(x)=1/x as x→∞ compared to the behavior of the constant function g(x)=0 as x→∞. I don't care about small values of x, just very large values.

The division by 0 is meant to be non-rigorous, and was probably a mistake to write either way. What I was trying to convey was that the standard definition of f(x) ~ g(x) is not satisfied for these two functions. Not because the limit fails to equal 1, but because, due to a technicality, the definition cannot even be applied.

The point of this post is that I'm simply observing that the functions f(x)=1/x and g(x)=0 become asymptotic for large values of x. And since that's what the symbol ~ is trying to measure, I was wondering if, since you can't divide by zero, whether we simply define 1/x ~ 0 to be true. That's why I brought up 0! = 1, because it doesn't conform to the standard definition of the factorial: n! = n(n-1)...1, and instead we just define it that way out of convenience/usefulness.

Sort of like considering the fact that 1/x + n ~ n, ... , 1/x + 2 ~ 2, 1/x + 1 ~ 1 are all true, and thus thinking that it might make sense and be convenient to go ahead and define 1/x + 0 ~ 0 to be true as well.
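That pattern for n ≥ 1 can be checked numerically (a rough sketch; the sample values of x and n are arbitrary):

```python
# Spot-check of 1/x + n ~ n for n >= 1: the ratio (1/x + n)/n tends to 1.
x = 1e12
for n in (1, 2, 5, 100):
    assert abs((1/x + n) / n - 1) < 1e-9
# At n = 0 the same ratio becomes (1/x)/0, which is undefined.
print("ratio tends to 1 for every n >= 1")
```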
 
Last edited:
Poopsilon said:
lol, ok, I'm assuming you know what the relation symbol ~ means. I'm just taking x to infinity and looking at the behavior of the function f(x)=1/x as x→∞ compared to the behavior of the constant function g(x)=0 as x→∞. I don't care about small values of x, just very large values.

The division by 0 is meant to be non-rigorous, and was probably a mistake to write either way. What I was trying to convey was that the standard definition of f(x) ~ g(x) is not satisfied for these two functions. Not because the limit fails to equal 1, but because, due to a technicality, the definition cannot even be applied.

The point of this post is that I'm simply observing that the functions f(x)=1/x and g(x)=0 become asymptotic for large values of x. And since that's what the symbol ~ is trying to measure, I was wondering if, since you can't divide by zero, whether we simply define 1/x ~ 0 to be true. That's why I brought up 0! = 1, because it doesn't conform to the standard definition of the factorial: n! = n(n-1)...1, and instead we just define it that way out of convenience/usefulness.

Sort of like considering the fact that 1/x + n ~ n, ... , 1/x + 2 ~ 2, 1/x + 1 ~ 1 are all true, and thus thinking that it might be convenient to go ahead and define 1/x + 0 ~ 0 to be true as well.

Oh, I see what you mean with ~, I thought you were talking intuitively.

Well, in that case 1/x ~ 0 is simply not true as $x \rightarrow +\infty$.
 
Argh, but we can't even evaluate 1/x ~ 0; it lies outside the scope of our definition. Shouldn't we then go ahead and just define it to be true? I think the final sentence in my previous post provides a rather convincing argument.

I'm not sure what you mean by talking intuitively, but either way the definition of f(x) ~ g(x) is lim x→∞ |f(x)|/|g(x)| = 1, just to make sure we are on the same page.
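With that definition in hand, the situation can be sketched numerically (the `ratio` helper and the sample point are illustrative choices, not standard names): for f(x) = 1 + 1/x against g(x) = 1 the ratio approaches 1, while against g(x) = 0 the quotient is undefined at every x.

```python
# Sketch of the test behind f ~ g: does |f(x)|/|g(x)| approach 1 for
# large x?  Here we just evaluate the ratio at one large sample point.
def ratio(f, g, x):
    return abs(f(x)) / abs(g(x))   # raises ZeroDivisionError when g(x) == 0

x = 1e9
r = ratio(lambda t: 1 + 1/t, lambda t: 1.0, x)
assert abs(r - 1) < 1e-8           # 1 + 1/x ~ 1 holds

try:
    ratio(lambda t: 1/t, lambda t: 0.0, x)   # the disputed case
except ZeroDivisionError:
    print("ratio undefined: the definition cannot be applied when g = 0")
```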
 
Poopsilon said:
Argh, but we can't even evaluate 1/x ~ 0, it lies outside the scope of our definition. Thus shouldn't we go ahead and just define it to be true?

We can't divide by 0, but we can do

$$\lim_{x\rightarrow +\infty} \frac{0}{1/x}$$

this limit is clearly 0 and not 1.
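The reversed quotient can be checked directly: 0/(1/x) is exactly 0 for every x > 0, so its limit is 0 rather than 1. A minimal check:

```python
# The reversed quotient 0/(1/x) equals 0 exactly for every x > 0,
# so its limit as x -> +oo is 0 and not 1.
for x in (1e3, 1e6, 1e9):
    assert 0.0 / (1.0 / x) == 0.0
```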
 
Well, that's true, you're right. I still feel like we should make an exception for the case where one function is identically zero and the other goes to zero, and call the relation true in that case.
 
Notice that ~ is used to study the behaviour of functions which go to infinity. It is not used for functions which converge to 0 or another value.

Saying that 1 + 1/x ~ 1 makes sense and is true, but it's not at all interesting. The interesting things arise when both functions go to infinity.
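As a quick numerical sketch of that interesting case (the sample points are arbitrary), take x² + x ~ x², where both functions go to infinity:

```python
# An example where ~ is informative: x^2 + x ~ x^2, because the ratio
# (x^2 + x)/x^2 = 1 + 1/x tends to 1 as x -> +oo.
for x in (1e3, 1e6, 1e9):
    assert abs((x*x + x) / (x*x) - 1) < 2/x
```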
 
Here's an intuitive explanation. x does not grow all that quickly; in fact, it's quite slow. x^2, the exponential, and the gamma function grow enormously faster, so it would not make sense to say that x is in the largest or fastest-growing class of functions imaginable. In fact, there is no such class.

On the other hand, 0 is quite simply the most slowly growing function of all. If 1/x were comparable to it, the most slowly growing of functions, then we might reason that its reciprocal, x, grows as fast as is imaginable, which of course is false.

It is not meaningful to compare a positive function to 0, because whatever (positive) unit we choose to measure them in, the positive function is infinitely larger than 0.

Of course, everything I say here is completely non-rigorous, but it may offer an explanation as to why "going to zero" is not comparable to "being equal to zero."
 