Homework Help: Proving 0/0 is undefined

1. Apr 21, 2010

lifelearner

1. The problem statement, all variables and given/known data

Prove that 0/0 is undefined

2. Relevant equations

See above

3. The attempt at a solution

Let 0/0 = a

Then 0/a*0 = 1

But a*0 = 0. So 0(1/0) = 1. But 0 times anything is zero, so 0 = 1, which is not true.

Last edited: Apr 21, 2010
2. Apr 21, 2010

skeptic2

Let 5/5 = a

Then 5/a*5 = 1?

3. Apr 21, 2010

lifelearner

Yes, iff a=1

4. Apr 29, 2010

Martin Rattigan

If you have defined 0/0 then it's defined. If you haven't it's undefined.

To prove that 0/0 is undefined would involve going back through all your past notes and pointing out that everything in them is not a definition of 0/0.

It's probably quicker to say:

Define 0/0=1
Therefore 0/0 is defined.
Therefore the result to be proved is false.

But what is probably required is to point out that there are multiple solutions to 0x=0, e.g. x=0 and x=1 (in which case I would say the question is badly worded).
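The non-uniqueness Martin mentions is easy to sketch numerically (illustrative code of my own, not anything from the course in question): every candidate x solves 0·x = 0, whereas an equation like 5·x = 5 picks out exactly one.

```python
# Every candidate value x solves 0*x == 0, so "the" solution is not unique.
candidates = [0, 1, 2, -7, 3.5]
print([x for x in candidates if 0 * x == 0])  # all five candidates qualify

# By contrast, 5*x == 5 picks out exactly one candidate.
print([x for x in candidates if 5 * x == 5])  # only 1
```

This is exactly why the "unique x with bx = a" definition of a/b cannot be extended to b = 0.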

5. Apr 29, 2010

Mathnomalous

Can't one just say "nothing can't be divided by nothing because there's nothing to divide"?

6. Apr 29, 2010

WhoWee

No - just do what you're told.

7. Apr 30, 2010

Keldon7

I believe the word undefined is well defined.

8. Apr 30, 2010

BerryBoy

I think this is a bad question for those learning maths; the sinc function, for example ($\mathrm{sinc}(x) = \frac{\sin x}{x}$), at $x = 0$ becomes:

$$\mathrm{sinc}\, 0 = \frac{0}{0} = 1$$

but this value is certainly well-defined (obtainable with L'Hopital's rule). I think the quantity zero needs careful treatment, as it can be misleading to say that anything divided by zero is undefined.

9. Apr 30, 2010

Hurkyl

Staff Emeritus
Ah, but the sinc function is not sin(x)/x -- the sinc function is the continuous extension of sin(x)/x.
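Hurkyl's distinction can be sketched in code (a hand-rolled piecewise definition for illustration, not a library function): the formula sin(x)/x is undefined at 0, so the continuous extension supplies the value there separately, by fiat, as the limit 1.

```python
import math

def sinc(x: float) -> float:
    """Continuous extension of sin(x)/x: the formula is undefined at 0,
    so the value there is supplied separately, equal to the limit, 1."""
    if x == 0.0:
        return 1.0          # defined by fiat, not computed from the formula
    return math.sin(x) / x  # the formula, valid for x != 0

print(sinc(0.0))   # 1.0, by definition
print(sinc(1e-8))  # very close to 1.0, by continuity
```

The value at 0 never comes from evaluating 0/0; it is an extra clause in the definition.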

10. Apr 30, 2010

resaypi

Suppose $f(a) = 0/0$.
As $x$ goes to $a$, $\lim f(x)$ can yield any real number, so it would not be wise to define 0/0 as some particular real number.
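A numerical sketch of that point (illustrative code, names of my own invention): for any target value c, the ratio (c·x)/x has the 0/0 form at x = 0, yet its limit there is c.

```python
def f(x, c):
    # c*x and x both go to 0 as x -> 0, so this is a "0/0 form" at 0,
    # yet the ratio is identically c away from 0.
    return (c * x) / x

for c in [0.0, 1.0, 2.0, -5.0]:
    print(c, f(1e-9, c))  # the ratio near 0 is already c
```

Since c was arbitrary, no single real number can serve as "the" value of 0/0.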

11. Apr 30, 2010

Martin Rattigan

No doubt meant in jest, but in fact it isn't as well defined as all that. It rather depends on who is defining or not defining.

For example, in some expositions of symbolic logic, "$\vee$" may be an undefined idea while $A\Rightarrow B$ is defined as $\neg A\vee B$, while in others "$\Rightarrow$" may be an undefined idea while $A\vee B$ is defined as $\neg A\Rightarrow B$.

Unfortunately there is no well defined authority to arbitrate on what is and isn't defined (though some American standards bodies have taken upon themselves this task in certain cases).

12. Apr 30, 2010

lifelearner

Is there any way to formulate a proof by contradiction?

EDIT: The following site seems to have an interesting approach to this query:

http://www.friesian.com/zero.htm

Last edited: Apr 30, 2010
13. Apr 30, 2010

Martin Rattigan

Yes. My original suggestion about going back through your past notes could be a proof by contradiction.

That is to say:

Assume 0/0 is not undefined. Then 0/0 is defined.

This means that a statement in the previous exposition of the subject (your notes) defines 0/0. (*)

The statements in your notes are:

1. ...
2. ...

...

n. ...

(You will have to provide the details here)

Then (assuming each is true).

1. doesn't define 0/0.
2. doesn't define 0/0.

...

n. doesn't define 0/0.

Therefore no statement in your notes defines 0/0. (**)

(*) and (**) are a contradiction, therefore the original assumption that 0/0 is not undefined is false.

Therefore 0/0 is undefined.

14. Apr 30, 2010

HallsofIvy

If 0/0 were "defined" then it would be defined to be some specific number. That is, there would be some specific x such that $\frac{0}{0}= x$. But saying "$\frac{a}{b}= x$" is the same as saying $a= bx$, and $0= 0x$ is true for all x.

By the way, many texts use the term "undefined" to refer to fractions such as $\frac{a}{0}$ where a is non-zero and the term "undetermined" to refer to 0/0.

The reason is that a/0 is undefined because $a= 0x$ is not true for any x (and, in particular, $\lim_{x\to a}\frac{f(x)}{g(x)}= L$, where f is a function with a nonzero limit and g a function that goes to 0 as x goes to a, is not true for any number L), while 0/0 is "undetermined" because $0= 0x$ is true for all x (and, in particular, $\lim_{x\to a}\frac{f(x)}{g(x)}= L$ may be true, and we can choose f and g to make L any specified number).

Last edited by a moderator: May 1, 2010
15. Apr 30, 2010

BerryBoy

Exactly my point: the fact that sin(x)/x may not be defined for a particular value of x can be misleading, especially when, in reality, we usually deal with continuous variations where undefined values suddenly become defined!

Apologies lifelearner, we're digressing from the problem.

16. Apr 30, 2010

Martin Rattigan

There is absolutely nothing to stop you defining 0/0 as some specific number. Some computer languages do exactly that. What you can't then do is say $\frac{a}{b}=x$ iff $bx=a$; this would not follow from your definition.
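As a concrete instance of Martin's point: IEEE 754 floating-point arithmetic (which NumPy follows; plain Python raises an exception for float division by zero instead) assigns fixed results to both a/0 and 0/0, at the cost of the usual division law.

```python
import numpy as np

with np.errstate(divide='ignore', invalid='ignore'):
    a = np.float64(1.0) / np.float64(0.0)  # IEEE 754: +inf
    b = np.float64(0.0) / np.float64(0.0)  # IEEE 754: NaN ("not a number")

print(a, b)                      # inf nan
print(np.isinf(a), np.isnan(b))  # True True

# The price: NaN does not even equal itself, so "a/b = x iff bx = a" is gone.
print(b == b)  # False
```

So here 0/0 *is* defined, as a special non-numeric value; the standard simply gives up the property that division inverts multiplication.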

It generally would follow from the definition in other cases, indeed it would usually be a prerequisite for the definition in other cases.

So for the real numbers you would prove that if $b\neq 0$, then $bx=by$ implies $x=y$. You would then be at liberty to define $\frac{a}{b}$, when $b\neq 0$, as the unique number $x$ satisfying $bx=a$. Without the proof of uniqueness the definition is invalid, which is why you can't extend it to the case where b is 0.

At this point $\frac{a}{b}$ would be undefined whenever $b=0$, but it would naturally follow immediately from the definition that when $\frac{a}{b}$ is defined $\frac{a}{b}=x$ iff $bx=a$.

You could then further define 0/0 as some specific number, say

$0/0=_{df}1$

but you could not then say that when $\frac{a}{b}$ is defined $\frac{a}{b}=x$ iff $bx=a$, because this would no longer be true. E.g. $0\times 2=0$, but $\frac{0}{0}=1\neq 2$.
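That counterexample can be checked mechanically with a hypothetical division function that adopts the definition 0/0 = 1 (my own toy helper, purely for illustration):

```python
def div(a, b):
    # Ordinary division, extended by the (arbitrary) definition 0/0 = 1.
    if a == 0 and b == 0:
        return 1
    return a / b

# The rule "a/b = x iff bx = a" now fails: 0*2 == 0, yet div(0, 0) != 2.
print(0 * 2 == 0)      # True
print(div(0, 0) == 2)  # False: div(0, 0) is 1 by definition
```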

When $\frac{a}{0}$ is defined in the extended complex numbers (for $a\neq 0$), it is then no longer universally true that when $\frac{a}{b}$ is defined $\frac{a}{b}=x$ iff $bx=a$.

Although you are at liberty to define 0/0 as any number you like (or anything else, for that matter), it's probably not a good idea, because "$\frac{a}{b}=x$ iff $bx=a$" can then fail. But saying it's a bad idea doesn't mean that nobody has ever done it, so if you were to take "undefined" to mean "nobody has ever defined it", you would be hard pressed to prove the original statement.

What I was trying to say in my first post is that you shouldn't be in the business of trying to prove or disprove whether or not something is defined. It is defined if you have accepted a definition of it and undefined otherwise; but this may change tomorrow after you've read or been taught a bit more (or adopted definitions of your own).

Last edited: Apr 30, 2010
17. May 1, 2010

jamalahmed68

See how 0/0 is undefined
0/0.1=10
0/0.01=100
0/0.001=1000
0/0.00000000001=100000000000
since 0.00000000001>0
if it is divided by 0 then its value approaches infinity..

18. May 1, 2010

jamalahmed68

See how 0/0 is undefined
0/0.1=10
0/0.01=100
0/0.001=1000
0/0.00000000001=100000000000
nevertheless 0.00000000001>0
if it is divided by 0 then its value approaches infinity..

19. May 1, 2010

HallsofIvy

Hardly worth posting twice! And if, instead of that particular sequence, you use .1/.1= 1, .01/.01= 1, .001/.001= 1, ..., the sequence converges to 1. Or .2/.1, .02/.01, .002/.001, ..., which converges to 2.

$$\lim_{x\to 0}\frac{f(x)}{g(x)}$$
does not, in general, go to infinity.

Last edited by a moderator: Aug 10, 2018
20. May 1, 2010

Martin Rattigan

Or even if you get the original arithmetic right, i.e.:

0/0.1=0
0/0.01=0
0/0.00000000001=0

it converges to 0.