Hello, and I'm sorry if I'm posting in the wrong place. As you can see this is my first post, and something has been irking me so much that I need clarification. I'm sure some people will be put off by some of the logic, but I just want clarification. I realize it's all conjecture with no real mathematical proof; I simply don't have enough knowledge in mathematics to even get close to attempting one. I'm all ears, though, so if there is some mathematical argument that sheds light on this, I'll try my best to understand it.

The basic question I want answered is: "Is zero a real number?" (insert gag reflex)

My train of thought started with why you can't divide by zero. You can divide something by a number that approaches zero, and on any graph the quotient blows up: as ε → 0⁺, x/ε → ∞ (and → -∞ from the other side). Informally, the reverse seems to hold too: x/∞ gives something infinitesimally small, call it ε. The first thing I learned in calculus was that a quantity which gets arbitrarily close to a number is, in the limit, equal to that number, as in 0.999999… = 1. So why doesn't the same work with zero? I just hear people say, "well, zero has a few exceptions because it's an identity element." As far as I can recall, 1 is also an identity element, and it's keeping itself in line just fine.

So then I thought, "Is zero a real number?" There's actually another "number" in mathematics that quantities can approach forever without ever equaling, and that is ∞. Why is something infinitely large not considered a real number, yet something infinitely small is? So then I thought, "Maybe zero is a concept rather than a value you can actually reach, just like ∞." This makes me think there's a good chance zero shares more traits with infinity than with a real number. If so, then 5+0, 5-0, and 5×0 would all be things you simply can't do, just like arithmetic with infinity.
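To make the two limit claims above concrete, here's a quick numerical sketch in Python (my choice of language, purely illustrative; floating-point arithmetic, not a proof): dividing by ever-smaller ε blows up, and the partial sums 0.9, 0.99, 0.999, … close in on 1.

```python
# Illustrating the two limits informally appealed to above.

x = 5.0
for eps in (1e-1, 1e-5, 1e-10):
    # x / eps grows without bound as eps shrinks toward 0+
    print(f"x / {eps:g} = {x / eps:g}")

# Partial sums of 0.999...: s_n = 1 - 10**(-n)
for n in (1, 4, 8):
    s_n = 1.0 - 10.0 ** -n
    print(f"s_{n} = {s_n}")
# The *limit* of s_n is exactly 1.  That is the precise sense in
# which 0.999... = 1: not "infinitely close to 1", but equal to 1.
```

The point of the second loop: "0.999… = 1" is a statement about the limit of the sequence of partial sums, not about a number creeping toward 1.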
Looked at another way, adding, subtracting, and multiplying by "nothing" (zero) might also be impossible, since it isn't actually a value. By this logic, every instance of zero in current mathematics would really have been an infinitesimal ε all along, so 5+ε=5, 5-ε=5, and 5·ε=5ε (still infinitesimal). Thinking it through, I concluded that we wouldn't notice any difference whatsoever.

I did some brief research on infinitesimals, and I'm fairly sure this is just a different way of approaching non-standard analysis. It was really brief research, though, and I don't have any idea where to start. So why does it matter if it makes no difference? Honestly, I don't know. Non-standard analysis got the same treatment at first: it was dismissed as just a different route to the same answers. Still, the distinction might matter somewhere I'm not aware of, like quantum mechanics. Maybe non-standard analysis can get somewhere that standard calculus can't because of that little difference.

Of course, I could be completely and utterly wrong somewhere. Criticism much appreciated.

P.S. I'm not trying to prove anything with this post; I'm looking for something that can clarify it a little more. I realize I'm just assuming zero is more similar to ∞ than to a real number, without any proof whatsoever.
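P.P.S. A crude floating-point analogy for the ε-arithmetic above, again in Python (floats only mimic infinitesimals, they aren't the real thing): if ε is small enough, ordinary doubles literally cannot register 5 ± ε as different from 5, yet 5·ε stays nonzero.

```python
# In double precision, adding a number smaller than the gap between
# representable values near 5.0 rounds away entirely -- a
# finite-precision analogy for "5 + eps = 5".
# (Floats are NOT infinitesimals; this is only an illustration.)
import sys

eps = 1e-30   # far below the float spacing near 5.0 (~8.9e-16)
print(5.0 + eps == 5.0)   # the addition is invisible
print(5.0 - eps == 5.0)   # so is the subtraction
print(5.0 * eps)          # but the product does NOT vanish

# The spacing ("machine epsilon") near 1.0, for comparison:
print(sys.float_info.epsilon)
```

Note how this mirrors the table 5+ε=5, 5-ε=5, 5·ε=5ε: addition and subtraction of a small-enough quantity are invisible, while multiplication still produces a small-but-nonzero result.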