Poirot said:I have read that by convention gcd(0,0)=0. But surely 1 fits the bill. Everything divides 0, including 1. But since 1 divides everything we must have gcd(0,0)=1
Bacterius said:
No, it is a convention: see here. Having $\gcd{(0, 0)} = 1$ would break the following property:

$$m \cdot \gcd{(a, b)} = \gcd{(m \cdot a, m \cdot b)} ~ ~ ~ \text{for} ~ m \geq 0$$

With $a = b = 0$, the left side would be $m \cdot 1 = m$ while the right side stayed $\gcd{(0, 0)} = 1$, forcing $m = 1$ for every $m$ — i.e., $1 = \text{anything}$.

Generally, conventions like these are chosen to minimize the number of special cases theorems have to deal with. In and of themselves, they are rather trivial - in practice, it's not very useful to know whether $\gcd{(0, 0)}$ is $0$ or $1$.

What's your source for the convention that gcd(0,0) = 0? I looked at several books on number theory, and they all include in the definition of gcd(a, b) the assumption that at least one of a and b is nonzero. Besides, since every positive integer divides 0, there is no greatest common divisor of 0 and 0.
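Whatever the convention's provenance, the scaling property Bacterius cites is easy to check against a standard library that adopts gcd(0, 0) = 0 — Python's `math.gcd` is one such. A quick sketch:

```python
from math import gcd

# The standard library follows the gcd(0, 0) = 0 convention.
print(gcd(0, 0))  # 0

# The property m * gcd(a, b) == gcd(m*a, m*b) then holds even for a = b = 0,
# since both sides are 0 for every m.
for m in range(5):
    assert m * gcd(0, 0) == gcd(m * 0, m * 0)

# Had gcd(0, 0) been defined as 1, the left side would be m while the
# right side stayed 1, which fails for every m other than 1.
```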
I like Serena said:I do not consider Wolfram|Alpha a good reference.
There are many cases where it does not give the proper mathematical answer.
I don't blame it - it's a calculator and not a math reference.
As for gcd(0,0), I don't see specific references to it on wiki or on wolfram|mathworld.
However, both give the definition that it "is the largest positive integer that divides the numbers without a remainder."
Since every positive integer divides 0, there is no largest one, and it would follow that gcd(0,0) is undefined (or infinite).
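As an aside not raised in the thread: one reason the gcd(0, 0) = 0 convention is natural in practice is that the classic Euclidean algorithm, run on (0, 0), terminates immediately and returns 0 with no special-casing. A minimal sketch:

```python
def euclid_gcd(a: int, b: int) -> int:
    """Classic Euclidean algorithm; yields 0 for gcd(0, 0) without a special case."""
    while b != 0:
        a, b = b, a % b
    return abs(a)

print(euclid_gcd(12, 18))  # 6
print(euclid_gcd(0, 7))    # 7
print(euclid_gcd(0, 0))    # 0 -- the loop never runs, so a = 0 is returned
```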