# Division by 0

1. Jul 23, 2014

### Nick_85

Hello
This question has been bugging me for years.
Why is division by 0 undefined, but not multiplication by 0?
If any number multiplied by 0 is 0, then the logical answer for any number divided by 0 would be ∞ (for me, at least).

If you "define" N/0=∞ and N*0=undefined, then there are no contradictions. Did mathematicians have to choose between n/0=undefined and n*0=undefined?

n*0=0
0/0=n special rule
n/0=∞
∞*0=n special rule

Sorry for my poor English; I am not a native speaker.

2. Jul 23, 2014

### phinds

If you allow that multiplication by 0 gives 0, this produces no difficulties of any kind.

If you allow division by 0, then you can arrive at mathematical impossibilities (1 = 0, for example).
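The contradiction phinds mentions takes only two lines, assuming nothing beyond division being the inverse of multiplication: if $1/0$ were equal to some number $x$, then

```latex
\frac{1}{0} = x \quad\Longrightarrow\quad 1 = x \times 0 = 0 ,
```

which is false, so no number $x$ can serve as $1/0$.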

3. Jul 23, 2014

### Nick_85

My question is: if you allow division by 0 (= ∞) and do not allow multiplication by 0, what then? Do you arrive at mathematical impossibilities like your example?

Last edited: Jul 23, 2014
4. Jul 23, 2014

### gopher_p

You can define division by zero any way you want. But most definitions of division by zero turn out to be fairly useless and cause more problems than they solve.

In complex analysis, one sometimes studies the set $\mathbb{C}^*=\mathbb{C}\cup\{\infty\}$. In this setting, functions like $f(z)=\frac{1}{z}$ are defined everywhere - with $f(0)=\infty$ and $f(\infty)=0$ - and it looks like we're permitting division by $0$. But then the algebraic structures involved - both in $\mathbb{C}^*$ itself and in the functions mapping $\mathbb{C}^*$ into itself - start to get a little more complicated. For instance, you now need to invent special rules for handling multiplication by $\infty$. The topological structures are also more complicated. Everything gets more complicated just so a relatively small class of "kinda bad" functions can get lumped in with the "good" ones.

As far as division between reals/rationals/integers (and complex numbers for that matter) goes, those divisions are typically defined in terms of multiplication; i.e. $x\div y=z \Leftrightarrow x=z\times y$ is a(n) (incomplete) definition for division rather than a rule derived from a definition. So from a strictly algebraic standpoint, which I'm guessing is where you're coming from, there is no way to define division by $0$ in a consistent way that actually looks like the way division is defined for the other numbers.
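The incompleteness gopher_p points out can be made concrete with a brute-force sketch (the helper below is purely illustrative, not anything from the thread): treat $x\div y$ as "the unique $z$ with $z\times y=x$" and search a small range of integers for such a $z$.

```python
def candidates(x, y, search_range=range(-10, 11)):
    """Return every z in search_range with z * y == x (illustrative only)."""
    return [z for z in search_range if z * y == x]

print(candidates(6, 3))   # [2] -- exactly one candidate, so 6/3 = 2
print(candidates(1, 0))   # []  -- no z satisfies z*0 == 1, so 1/0 is undefined
print(candidates(0, 0))   # every z in the range works, so 0/0 has no unique value
```

For $y=0$ the definition fails in both directions: $1/0$ has no candidate at all, while $0/0$ has every candidate, so neither can be assigned a unique value.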

5. Jul 23, 2014

This FAQ may help (it was very helpful to me).

Last edited by a moderator: May 6, 2017
6. Jul 23, 2014

### Nick_85

editing

Last edited: Jul 23, 2014
7. Jul 23, 2014

### symbolipoint

Look very simply at what DIVISION means. Start with a number, then repeatedly subtract another number (the divisor) from it, updating the starting number after each subtraction, until no more of the divisor can be subtracted. If your divisor is zero, the process will never finish. The result is not a number; it is not a variable; it is not any value.
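The repeated-subtraction procedure described above can be sketched in a few lines (a toy implementation for non-negative integers; the explicit zero-divisor guard is there because the loop would otherwise run forever, which is exactly the point):

```python
def divide_by_subtraction(dividend, divisor):
    """Integer division of non-negative ints by repeated subtraction."""
    if divisor == 0:
        # Subtracting 0 never shrinks the dividend, so the loop below
        # would never terminate -- the process "never finishes".
        raise ZeroDivisionError("repeated subtraction never terminates")
    quotient = 0
    while dividend >= divisor:
        dividend -= divisor
        quotient += 1
    return quotient, dividend  # (quotient, remainder)

print(divide_by_subtraction(17, 5))  # (3, 2): 17 = 3*5 + 2
```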

8. Jul 23, 2014

### 1MileCrash

It is undefined in the reals. There is no real number x that satisfies 1/0 = x within that arena. Your idea that 1/0 = inf is actually used in many other arenas.

This is completely well-behaved, provided the other definitions of operations regarding infinity are followed.

Take a look at the Riemann Sphere.
http://en.m.wikipedia.org/wiki/Riemann_sphere

There is no contradiction that can be found by saying that 1/0 = inf in the Riemann Sphere. Other operations on infinity do seem impossible to make work (while maintaining valuable properties of operations on the set, that is). For example, I personally know of no example where inf-inf is defined, because this would break many desirable properties.
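Those conventions can be sketched as a toy model (the rules below are the usual Riemann-sphere conventions, but the code itself is illustrative, not any standard library; `INF` here stands for the single point at infinity, not Python's `float('inf')`):

```python
INF = "inf"  # the one point at infinity on the Riemann sphere

def rs_div(a, b):
    """Division on C ∪ {∞}: z/0 = ∞ for z != 0 and z/∞ = 0 for z != ∞,
    while 0/0 and ∞/∞ remain undefined."""
    if a == INF and b == INF:
        raise ValueError("inf/inf is undefined")
    if a == INF:
        return INF
    if b == INF:
        return 0
    if b == 0:
        if a == 0:
            raise ValueError("0/0 is undefined")
        return INF
    return a / b

print(rs_div(1, 0))    # inf
print(rs_div(1, INF))  # 0
```

Note that the two `raise` branches are where the "special rules" run out: no assignment to $0/0$ or $\infty/\infty$ (or $\infty-\infty$) keeps the arithmetic consistent, so those cases stay undefined even on the sphere.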

Basically, the results of arithmetic depend upon the "world" we are working in, and the world is our creation. Although 1/0 is not infinity in the reals with addition and multiplication in the normal way, there is nothing wrong with thinking about other "worlds" where infinity is a number, and 1/0 = inf, and seeing what happens.

This only supports his intuition regarding infinity.