Does 1/x approach 0 as x approaches infinity?

  • Thread starter Poopsilon
In summary, the conversation revolves around whether 1/x ~ 0 as x approaches infinity. While this does not hold under the standard definition of asymptotic equivalence, there is a suggestion to define it specially in order to make the notation more convenient, much as 0! = 1 is defined. However, it is pointed out that ~ is used specifically to study the behavior of functions that go to infinity, not functions that converge to 0. Therefore, the idea of defining 1/x ~ 0 does not fit the intended use of the notation.
  • #1
Poopsilon
I'm curious if 1/x ~ 0. Technically by the definition I know it's not since lim x→∞ (1/x)/0 = ∞. But I feel like it does satisfy what the 'on the order of twiddles' is trying to measure. Thus I was wondering if maybe we specially define this to be true in the same way we might define 0! = 1.
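For reference, here is the definition the rest of the thread appeals to (a sketch of the usual convention for asymptotic equivalence as x → ∞, not stated explicitly in the original post): f(x) ~ g(x) means

[tex]\lim_{x\rightarrow +\infty}\frac{f(x)}{g(x)}=1,[/tex]

which presupposes that g(x) ≠ 0 for all sufficiently large x.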
 
  • #2
OK, your post is making very little sense.

Poopsilon said:
I'm curious if 1/x ~ 0.

What is x?? If x is 0.01, then 1/x is 100. I doubt you would say this is close to 0.

You need to specify x. If you say that 1/x ~ 0 when x is large, that is non-rigorous but meant to be intuitive. You can make it rigorous by saying that

[tex]\lim_{x\rightarrow +\infty}\frac{1}{x}=0[/tex]
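Spelled out (a sketch of the standard ε-N formulation of that limit, with nothing assumed beyond the definition of a limit at infinity), this says

[tex]\forall\,\varepsilon>0\ \exists\,N>0:\quad x>N\ \Longrightarrow\ \left|\frac{1}{x}-0\right|<\varepsilon,[/tex]

and one may take N = 1/ε, since x > 1/ε gives 1/x < ε.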

Technically by the definition I know it's not since lim x→∞ (1/x)/0 = ∞.

Why do you divide by 0?? You can never divide by 0.

But I feel like it does satisfy what the 'on the order of twiddles' is trying to measure. Thus I was wondering if maybe we specially define this to be true in the same way we might define 0! = 1.

What does any of this have to do with 0! = 1?
 
  • #3
lol, ok, I'm assuming you know what the relation symbol ~ means. I'm just taking x to infinity, and I'm looking at the behavior of the function f(x)=1/x as x→∞ compared to the behavior of the constant function g(x)=0 as x→∞. I don't care about small values of x, just very large values.

The division by 0 is meant to be non-rigorous, and was probably a mistake to write either way. What I was trying to convey was that the standard definition of f(x) ~ g(x) is not satisfied for these two functions. Not because the limit fails to come out right, but because, due to a technicality, the definition cannot even be applied.

The point of this post is that I'm simply observing that the functions f(x)=1/x and g(x)=0 become asymptotic for large values of x. And since that's what the symbol ~ is trying to measure, I was wondering whether, since you can't divide by zero, we could simply define 1/x ~ 0 to be true. That's why I brought up 0! = 1: it doesn't conform to the standard definition of the factorial, n! = n(n-1)...1, and instead we just define it that way out of convenience/usefulness.

Sort of like considering the fact that 1/x + n ~ n, ... , 1/x + 2 ~ 2, 1/x + 1 ~ 1 are all true, and thus thinking that it might make sense and be convenient to go ahead and define 1/x + 0 ~ 0 to be true as well.
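As a quick check (a worked instance of the definition, assuming n is a nonzero constant), each of those relations does hold:

[tex]\lim_{x\rightarrow +\infty}\frac{\frac{1}{x}+n}{n}=\frac{0+n}{n}=1\qquad (n\neq 0),[/tex]

while at n = 0 the quotient (1/x + 0)/0 is undefined for every x, which is exactly where the definition stops applying.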
 
  • #4
Poopsilon said:
I'm assuming you know what the relation symbol ~ means. [...] And since that's what the symbol ~ is trying to measure, I was wondering whether, since you can't divide by zero, we could simply define 1/x ~ 0 to be true.

Oh, I see what you mean with ~, I thought you were talking intuitively.

Well, in that case 1/x ~ 0 is simply not true if [itex]x\rightarrow +\infty[/itex].
 
  • #5
Argh, but we can't even evaluate 1/x ~ 0; it lies outside the scope of our definition. Shouldn't we therefore go ahead and just define it to be true? I think the final sentence in my previous post provides a rather convincing argument.

I'm not sure what you mean by talking intuitively, but either way the definition of f(x) ~ g(x) is lim x→∞ |f(x)|/|g(x)| = 1, just to make sure we are on the same page.
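Substituting g(x) = 0 into that definition makes the "outside the scope" point explicit (this is only a restatement, no new assumptions): the quantity

[tex]\frac{|f(x)|}{|g(x)|}=\frac{|1/x|}{0}[/tex]

is undefined for every x, so the limit in the definition cannot even be formed.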
 
  • #6
Poopsilon said:
Argh, but we can't even evaluate 1/x ~ 0, it lies outside the scope of our definition. Thus shouldn't we go ahead and just define it to be true?

We can't divide by 0, but we can do

[tex]\lim_{x\rightarrow +\infty} \frac{0}{1/x}[/tex]

This limit is clearly 0, not 1.
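Step by step (just the arithmetic behind the claim): for every x > 0 we have 1/x > 0, so

[tex]\frac{0}{1/x}=0,[/tex]

and the limit of this constant expression as x → ∞ is 0. So even the flipped quotient, which avoids dividing by zero, does not give 1.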
 
  • #7
Well, that's true, you're right. I still feel like we should make an exception for the case where one of the functions is identically zero and the other goes to zero, and call the relation true in that case.
 
  • #8
Notice that ~ is used to study the behaviour of functions which go to infinity. It is not used for functions which converge to 0 or another value.

Saying that 1 + 1/x ~ 1 makes sense and is true, but it's not at all interesting. The interesting things arise when both functions go to infinity.
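A standard example of the kind of statement ~ is designed for (a generic textbook instance, not taken from this thread):

[tex]x^{2}+x\sim x^{2}\quad (x\rightarrow +\infty),\qquad\text{since}\qquad \lim_{x\rightarrow +\infty}\frac{x^{2}+x}{x^{2}}=\lim_{x\rightarrow +\infty}\left(1+\frac{1}{x}\right)=1.[/tex]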
 
  • #9
Here's an intuitive explanation. x does not grow all that quickly; in fact, it's quite slow. x^2, the exponential, or the gamma function grow enormously faster, so it would not make sense to say that x is in the largest or fastest-growing class of functions imaginable. In fact, there is no such class.

On the other hand, 0 is quite simply the most slowly growing function there is. If 1/x were comparable to it, the most slowly growing of functions, then we might reason that its reciprocal grows as fast as is imaginable, which of course is false.

It is not meaningful to compare a positive function to 0, because whatever (positive) unit we choose to measure them in, the positive function is infinitely larger than 0.

Of course, everything I say here is completely non-rigorous, but it may offer an explanation as to why "going to zero" is not comparable to "being equal to zero."
 

1. What does it mean for 1/x to approach 0?

When 1/x approaches 0, it means that as x gets larger and larger, the value of 1/x gets closer and closer to 0. This can also be thought of as the limit of 1/x as x approaches infinity.
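For instance (a simple numerical illustration added for concreteness):

[tex]x=10\ \Rightarrow\ \tfrac{1}{x}=0.1,\qquad x=1000\ \Rightarrow\ \tfrac{1}{x}=0.001,\qquad x=10^{6}\ \Rightarrow\ \tfrac{1}{x}=10^{-6},[/tex]

and in limit notation, [itex]\lim_{x\rightarrow \infty}\frac{1}{x}=0[/itex].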

2. Is 1/x equal to 0 when x is a large number?

No, 1/x is never equal to 0. As x gets larger, the value of 1/x gets closer to 0, but it will never actually reach 0.

3. Why is it incorrect to say that 1/x equals 0?

It is incorrect to say that 1/x equals 0 because, as explained above, 1/x never actually reaches 0. It may get very close to 0, but it will always have a non-zero value.

4. What is the significance of 1/x approaching 0?

The significance of 1/x approaching 0 is that it is a way to represent very small numbers or values that are close to 0. This concept is important in many areas of mathematics and science, such as calculus and physics.

5. Can the value of 1/x ever be exactly 0?

No, the value of 1/x can never be exactly 0. For 1/x to equal 0, the numerator would have to be 0, but it is fixed at 1; no matter how large x gets, 1/x remains a small but nonzero number. (The value x = 0 is excluded as well, since the fraction would then be undefined.)
