The greatest common divisor of n integers

SUMMARY

The discussion focuses on defining the greatest common divisor (gcd) of a set of n integers, S={a_1...a_n}, and proving its existence and representation as a linear combination of the integers involved. The initial claim that gcd(a_1...a_n) equals the minimum of gcd(a_i, a_j) for i,j in {1...n} is proven incorrect with a counterexample of the set {6,10,15}. The correct approach involves using induction and Bézout's identity, demonstrating that the gcd can be expressed as a linear combination of the integers in the set.

PREREQUISITES
  • Understanding of gcd and its properties
  • Familiarity with Euclid's Algorithm
  • Knowledge of Bézout's identity
  • Basic principles of mathematical induction
NEXT STEPS
  • Study Euclid's Algorithm for computing gcd
  • Learn about Bézout's identity and its applications
  • Explore mathematical induction techniques in proofs
  • Investigate properties of linear combinations in number theory
USEFUL FOR

Mathematics students, educators, and anyone interested in number theory, particularly those studying gcd and its applications in algebra and proofs.

Ratpigeon

Homework Statement


Define the gcd of a set of n integers, S = {a_1, ..., a_n}.
Prove that it exists and can be written as
q_1 a_1 + ... + q_n a_n for some integers q_1, ..., q_n.

Homework Equations



Euclid's Algorithm?

The Attempt at a Solution



I have the statement that gcd(a_1, ..., a_n) = min( gcd(a_i, a_j) : i, j ∈ {1, ..., n} ),

and from that I know I can say: since any two numbers have a gcd, the gcd exists and can be represented as a linear combination of the a_i and a_j that attain the minimal gcd. But I don't know how to prove the first statement.
 
I say this should be moved to the Precalculus help page...

Anyhow... :) Well, for a start, your statement is NOT true.
Consider {6, 10, 15}. Then gcd(6, 10, 15) = 1, but gcd(6, 10) = 2, gcd(6, 15) = 3, and gcd(10, 15) = 5.
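The counterexample is easy to check numerically; a quick sketch using Python's standard math.gcd:

```python
from math import gcd
from functools import reduce

s = [6, 10, 15]

# gcd of the whole set: fold the pairwise gcd over the list
print(reduce(gcd, s))                        # 1

# every pairwise gcd is strictly larger than the gcd of the set
print(gcd(6, 10), gcd(6, 15), gcd(10, 15))   # 2 3 5
```

So the minimum of the pairwise gcds is 2, not 1, and the proposed identity fails.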

Try solving it for n = 3. See if that gives you any ideas on how to approach the general case.
 
This is easily done by induction (over the number of arguments) from gcd(a_1, a_2) = q_1 a_1 + q_2 a_2.
 
voko said:
This is done by induction.

Assume that gcd(a_1, a_2) = q_1 a_1 + q_2 a_2. This is indeed the case and is known as Bézout's identity (see below).

Now assume gcd(a_1, ..., a_n) = q_1 a_1 + ... + q_n a_n, and check for gcd(a_1, ..., a_n, a_{n + 1}).

It is trivial to show that gcd(a_1, ..., a_n, a_{n + 1}) = gcd(gcd(a_1, ..., a_n), a_{n + 1}). Applying Bézout's identity to that pair of numbers,

gcd(a_1, ..., a_n, a_{n + 1}) = A gcd(a_1, ..., a_n) + B a_{n + 1}
= A (q_1 a_1 + ... + q_n a_n) + B a_{n + 1}
= A q_1 a_1 + ... + A q_n a_n + B a_{n + 1},

which is what we are looking for.
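The induction above is constructive: extended Euclid supplies the base-case Bézout coefficients, and each step rescales the accumulated coefficients by A and appends B. A minimal sketch; the function names (bezout, bezout_n) are illustrative, not from the thread:

```python
def bezout(a, b):
    """Extended Euclid: return (g, x, y) with g = gcd(a, b) = x*a + y*b."""
    if b == 0:
        return a, 1, 0
    g, x, y = bezout(b, a % b)
    return g, y, x - (a // b) * y

def bezout_n(nums):
    """Return (g, coeffs) with g = gcd(nums) = sum(q_i * a_i), using the
    induction gcd(a_1, ..., a_{n+1}) = gcd(gcd(a_1, ..., a_n), a_{n+1})."""
    g, coeffs = nums[0], [1]
    for a in nums[1:]:
        g, A, B = bezout(g, a)                 # g = A*(old g) + B*a
        coeffs = [A * q for q in coeffs] + [B]
    return g, coeffs

g, q = bezout_n([6, 10, 15])
print(g)                                        # 1
print(sum(qi * ai for qi, ai in zip(q, [6, 10, 15])))  # 1
```

For {6, 10, 15} this returns gcd 1 together with integer coefficients whose linear combination with the set equals 1, exactly the representation the problem asks for.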

You shouldn't merely give the answer - next time, try to lead the OP toward the correct solution. :)
 
Thanks, I didn't catch the counterexample, which set me off on the wrong track. Should be fine now.
 
