Poirot1
I have read that by convention gcd(0,0) = 0. But surely 1 fits the bill: everything divides 0, including 1. And since 1 divides everything, we must have gcd(0,0) = 1.
Poirot said: I have read that by convention gcd(0,0) = 0. But surely 1 fits the bill: everything divides 0, including 1. And since 1 divides everything, we must have gcd(0,0) = 1.

No, it is a convention: see here. Having $\gcd{(0, 0)} = 1$ would break the following property:
$$m \cdot \gcd{(a, b)} = \gcd{(m \cdot a, m \cdot b)} \quad \text{for } m \geq 0,$$
which would yield $1 = \text{anything}$.
Generally, conventions like these are chosen to minimize the number of special cases theorems have to deal with. In and of themselves, they are rather trivial; in practice, it's not very useful to know whether $\gcd{(0, 0)} = 0$ or $1$.

Bacterius said: No, it is a convention: see here. Having $\gcd{(0, 0)} = 1$ would break the following property: [...]

What's your source for the convention that gcd(0,0) = 0? I looked at several books on number theory, and they all include in the definition of gcd(a, b) the assumption that at least one of a and b is nonzero. Besides, since every positive integer divides 0, there is no greatest common divisor of 0 and 0.
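(As an editorial aside, not part of the thread: the scaling identity Bacterius cites can be checked mechanically. The minimal Python sketch below uses `math.gcd`, which already follows the gcd(0, 0) = 0 convention, and shows why defining gcd(0, 0) = 1 would force 1 to equal every nonnegative m.)

```python
from math import gcd  # Python's math.gcd uses the convention gcd(0, 0) == 0

# Check the scaling identity  m * gcd(a, b) == gcd(m * a, m * b)  for m >= 0,
# including the a = b = 0 case that the convention is meant to cover.
for m in range(6):
    for a in range(6):
        for b in range(6):
            assert m * gcd(a, b) == gcd(m * a, m * b), (m, a, b)

print(gcd(0, 0))  # 0

# If we instead defined gcd(0, 0) = 1, the identity with a = b = 0 would
# demand m * 1 == gcd(0, 0) == 1 for every m >= 0, which fails for all m != 1.
```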
I like Serena said: I do not consider Wolfram|Alpha a good reference.
There are many cases where it does not give the proper mathematical answer.
I don't blame it - it's a calculator and not a math reference.
As for gcd(0,0), I don't see specific references to it on Wikipedia or on Wolfram MathWorld.
However, both give the definition that it "is the largest positive integer that divides the numbers without a remainder."
Since every positive integer, no matter how large, divides 0, it would follow that gcd(0,0) is undefined (or infinite).
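(Another editorial footnote, not from the thread, on where the gcd(0, 0) = 0 convention shows up in practice: a plain Euclidean algorithm, sketched below with a hypothetical name, returns 0 for gcd(0, 0) with no special-casing, and Python's math.gcd behaves the same way.)

```python
def euclid_gcd(a: int, b: int) -> int:
    """Plain Euclidean algorithm for nonnegative integers.

    The base case gcd(a, 0) = a means gcd(0, 0) falls out as 0
    without any special case.
    """
    while b != 0:
        a, b = b, a % b
    return a

print(euclid_gcd(12, 18))  # 6
print(euclid_gcd(0, 0))    # 0
```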