i think this question deserves a place in a math FAQ.
people who puzzle over division by zero seem to forget what division actually means.
the steps of division: subtract the divisor from the dividend as long as something remains of the dividend, i.e., until it hits 0. for each successful subtraction, add 1 to a counter; the final counter value is the result of the division. (i'm not covering the details of fractional parts, since my purpose here is just to recall the definition.)
so, applying this definition to 1/0: subtract 0 from 1 as long as you have a non-0 remainder, and add 1 to the counter each time. now, if you actually do this for 1/0, what's your counter?
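to make this concrete, here's a minimal Python sketch of that repeated-subtraction definition (the function name and the max_steps cutoff are my own additions, there only so the demonstration terminates). for 1/0 the loop never brings the dividend to 0, so no counter value ever qualifies:

```python
def divide_by_repeated_subtraction(dividend, divisor, max_steps=10**6):
    """Quotient per the definition above: count how many subtractions
    of the divisor bring the dividend down to exactly 0.

    Assumes nonnegative integers and an exact division (no fractional
    part), matching the simplified definition in the post.
    """
    counter = 0
    remaining = dividend
    while remaining != 0:
        if counter >= max_steps:
            # the loop would run forever: no finite counter works
            raise ArithmeticError("dividend never reaches 0")
        remaining -= divisor
        counter += 1
    return counter

print(divide_by_repeated_subtraction(6, 2))  # 3
print(divide_by_repeated_subtraction(1, 0))  # raises ArithmeticError
```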
a similar idea applies to 0/0: subtract 0 from 0 as long as you have a non-0 remainder. the indeterminacy shows up right here: no matter how many times you repeat this (and remember you're adding 1 to the counter at each step), the result stays consistent with the definition of division. 0/0 = 5, 0/0 = 9, 0/0 = anything.
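you can check the same thing from the other direction: instead of running the loop, test whether a candidate counter satisfies the definition, i.e., whether that many subtractions of the divisor leave exactly 0 of the dividend. this little check (the helper name is hypothetical, mine) shows every candidate works for 0/0 and none works for 1/0:

```python
def is_valid_quotient(dividend, divisor, candidate):
    # the definition, unrolled: `candidate` subtractions of the
    # divisor must leave exactly 0 of the dividend
    return dividend - candidate * divisor == 0

print(all(is_valid_quotient(0, 0, q) for q in range(100)))  # True: 0/0 = anything
print(any(is_valid_quotient(1, 0, q) for q in range(100)))  # False: 1/0 = nothing
```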
an application: x \cdot 0 = 0. this equation holds for every value of x, i.e., it's independent of the contents of x, as long as x is a valid number. dividing both sides by 0, you get x \cdot \frac{0}{0} = \frac{0}{0}: an equation independent of the value of x. it's pretty weird at first glance, but if you remember what division is, the explanation reveals itself.
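to spell the step out (reading each \frac{0}{0} as an independent indeterminate value, which is the convention the previous paragraphs argue for):

$$
x \cdot 0 = 0 \;\text{ for every } x
\quad\Longrightarrow\quad
x \cdot \frac{0}{0} = \frac{0}{0},
$$

and this stays consistent for any x precisely because each \frac{0}{0} can stand for any number: with x = 5, reading the left \frac{0}{0} as 1 and the right one as 5 gives 5 \cdot 1 = 5.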
i'd like to emphasize that division by zero is not undefined, it's indeterminate (or maybe "uncertain" is a better word), and there are problem-dependent methods for resolving the indeterminacy.