MHB What is the value of gcd(0,0) in mathematics?

  • Thread starter: Poirot1
SUMMARY

The greatest common divisor gcd(0, 0) is conventionally defined as 0, although some argue it should be 1 since 1 divides every integer, including 0. Setting gcd(0, 0) = 1, however, would contradict established properties such as \(m \cdot \gcd(a, b) = \gcd(m \cdot a, m \cdot b)\) for \(m \geq 0\). Many number theory texts instead require at least one argument to be nonzero and treat gcd(0, 0) as undefined (or infinite), since every positive integer divides 0 and so no largest common divisor exists. The discussion highlights how conventions in mathematics are chosen to minimize special cases.

PREREQUISITES
  • Understanding of the concept of greatest common divisor (gcd)
  • Familiarity with basic number theory
  • Knowledge of mathematical conventions and their implications
  • Ability to interpret mathematical properties and definitions
NEXT STEPS
  • Research the mathematical definition of gcd in various number theory textbooks
  • Explore the implications of defining gcd(0, 0) as 0 versus 1
  • Investigate the concept of undefined values in mathematics
  • Learn about mathematical conventions and their role in simplifying theorems
USEFUL FOR

Mathematicians, students of number theory, educators teaching gcd concepts, and anyone interested in the foundations of mathematical definitions and conventions.

Poirot1
I have read that by convention gcd(0,0) = 0. But surely 1 fits the bill: everything divides 0, including 1, and since 1 divides everything, we must have gcd(0,0) = 1.
 
Poirot said:
I have read that by convention gcd(0,0) = 0. But surely 1 fits the bill: everything divides 0, including 1, and since 1 divides everything, we must have gcd(0,0) = 1.

What's your source for the convention that gcd(0,0) = 0? I looked at several books on number theory and they all include in the definition of gcd(a,b) the assumption that at least one of a and b is nonzero. Besides, since every positive integer divides 0, there is no greatest common divisor of 0 and 0.
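(In symbols: $n \cdot 0 = 0$ for every positive integer $n$, so $n \mid 0$; the common divisors of $0$ and $0$ are therefore all positive integers, and that set has no largest element.)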
 
What's your source for the convention that gcd(0,0) = 0? I looked at several books on number theory and they all include in the definition of gcd(a,b) the assumption that at least one of a and b is nonzero. Besides, since every positive integer divides 0, there is no greatest common divisor of 0 and 0.
No, it is a convention: see here. Having $\gcd{(0, 0)} = 1$ would break the following property:

$$m \cdot \gcd{(a, b)} = \gcd{(m \cdot a, m \cdot b)} ~ ~ ~ \text{for} ~ m \geq 0$$

With $\gcd{(0, 0)} = 1$, this would yield $1 = \text{anything}$.
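Spelling that out: take $a = b = 0$ and write $g = \gcd{(0, 0)}$; then

$$m \cdot g = \gcd{(m \cdot 0, m \cdot 0)} = \gcd{(0, 0)} = g ~ ~ ~ \text{for all} ~ m \geq 0,$$

so if $g = 1$ this reads $m = 1$ for every $m \geq 0$. The only value consistent with the property for all $m$ is $g = 0$.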

Generally, conventions like these are chosen to minimize the number of special cases theorems have to deal with. In and of themselves they are rather trivial; in practice, it's not very useful to know whether $\gcd{(0, 0)} = 0 ~ \text{or} ~ 1$.
 
Bacterius said:
No, it is a convention: see here. Having $\gcd{(0, 0)} = 1$ would break the following property:

$$m \cdot \gcd{(a, b)} = \gcd{(m \cdot a, m \cdot b)} ~ ~ ~ \text{for} ~ m \geq 0$$

With $\gcd{(0, 0)} = 1$, this would yield $1 = \text{anything}$.

Generally, conventions like these are chosen to minimize the number of special cases theorems have to deal with. In and of themselves they are rather trivial; in practice, it's not very useful to know whether $\gcd{(0, 0)} = 0 ~ \text{or} ~ 1$.

I do not consider Wolfram|Alpha a good reference.
There are many cases where it does not give the proper mathematical answer.
I don't blame it - it's a calculator and not a math reference.

As for gcd(0,0), I don't see specific references to it on wiki or on wolfram|mathworld.
However, both give the definition that it "is the largest positive integer that divides the numbers without a remainder."
Since every positive integer, however large, divides 0, it would follow that gcd(0,0) is undefined (or infinite).
 
I like Serena said:
I do not consider Wolfram|Alpha a good reference.
There are many cases where it does not give the proper mathematical answer.
I don't blame it - it's a calculator and not a math reference.

As for gcd(0,0), I don't see specific references to it on wiki or on wolfram|mathworld.
However, both give the definition that it "is the largest positive integer that divides the numbers without a remainder."
Since every positive integer, however large, divides 0, it would follow that gcd(0,0) is undefined (or infinite).


The assumption was that the people who wrote Wolfram|Alpha probably know their stuff, and would correctly handle the special cases. Of course it's not an official reference, but it's handy to quickly check things :)
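For what it's worth, common software also follows the $\gcd{(0, 0)} = 0$ convention. A quick check in Python (assuming Python 3.5+, where math.gcd is available):

import math

# Python's math.gcd adopts the gcd(0, 0) = 0 convention
print(math.gcd(0, 0))   # 0
print(math.gcd(0, 12))  # 12, since gcd(0, n) = n
print(math.gcd(8, 12))  # 4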
 
