If a > b, where a > 0 and b > 0, then 1/a < 1/b. But what happens if b = 0? For example, if x > 0, that is, if x is a positive number, then it seems it should follow that 1/x > 0. Yes, yes, I know that plugging b = 0 into the rule would mean dividing by 0, but the conclusion still makes sense intuitively: if x is a positive number, then obviously 1/x is also a positive number, so 1/x > 0 should hold. For instance, x = 2 gives 1/x = 1/2 > 0. Can somebody explain what is going on here? Thank you.
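For reference, here is the derivation of the rule as I understand it, written out so you can see where b = 0 seems to cause trouble (this is the standard argument, dividing the inequality by ab):

```latex
% From a > b with a > 0 and b > 0, we have ab > 0,
% so we may divide both sides of a > b by ab without
% flipping the inequality:
\[
  a > b
  \;\Longrightarrow\;
  \frac{a}{ab} > \frac{b}{ab}
  \;\Longrightarrow\;
  \frac{1}{b} > \frac{1}{a}.
\]
% The division step requires ab > 0. If b = 0, then ab = 0
% and the step is invalid; moreover 1/b = 1/0 is undefined,
% so the rule's conclusion cannot even be stated.
```

So the rule itself only applies when both a and b are strictly positive, which is why I'm confused about what justifies the (intuitively obvious) statement 1/x > 0.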