lol, ok, I'm assuming you know what the relation symbol ~ means. I'm just taking x to infinity and looking at the behavior of the function f(x) = 1/x as x→∞ compared to the behavior of the constant function g(x) = 0 as x→∞. I don't care about small values of x, just very large values.
The division by 0 was meant to be non-rigorous, and was probably a mistake to write either way. What I was trying to convey is that the standard definition of f(x) ~ g(x) is not satisfied for these two functions: not because the limit fails to equal 1, but because, due to a technicality, the definition cannot even be applied.
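For reference, the standard definition I have in mind is the limit-of-the-ratio one:

```latex
f(x) \sim g(x) \quad \Longleftrightarrow \quad \lim_{x \to \infty} \frac{f(x)}{g(x)} = 1
```

With g(x) = 0, the ratio f(x)/g(x) is undefined for every x, so the limit on the right can't even be formed.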
The point of this post is simply to observe that the functions f(x) = 1/x and g(x) = 0 become asymptotic for large values of x. Since that's the behavior the symbol ~ is trying to capture, I was wondering whether, given that you can't divide by zero, we could simply define 1/x ~ 0 to be true. That's why I brought up 0! = 1: it doesn't conform to the standard definition of the factorial, n! = n(n-1)...1, and instead we just define it that way out of convenience/usefulness.
Sort of like noting that 1/x + n ~ n, ... , 1/x + 2 ~ 2, 1/x + 1 ~ 1 are all true, and thinking it might be convenient to go ahead and define 1/x + 0 ~ 0 to be true as well.
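You can sanity-check those nonzero cases numerically (a throwaway sketch, not a proof): for any fixed n ≥ 1, the ratio (1/x + n)/n approaches 1 as x grows, which is exactly what the ratio definition of ~ demands. For n = 0, the very same ratio would require dividing by zero.

```python
# Check that (1/x + n)/n -> 1 as x grows, for a few sample values of n >= 1.
# This illustrates why 1/x + n ~ n holds for n != 0, while the ratio test
# is simply undefined when n = 0.
for n in [1, 2, 10]:
    for x in [1e3, 1e6, 1e9]:
        ratio = (1 / x + n) / n
        print(f"n={n}, x={x:.0e}: ratio={ratio}")
```

The printed ratios get closer to 1 as x increases, and they do so faster for larger n, since the 1/x perturbation is relatively smaller.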