Some proofs involving greatest common divisors

SUMMARY

This discussion focuses on proving three properties related to greatest common divisors (gcd). The first proof establishes that if gcd(a, b) = c, then gcd(a, a+b) = c, using properties of divisibility. The second proof attempts to show that if gcd(a, b) = c and a = a'c, b = b'c, then gcd(a', b') = 1, although the proof lacks clarity. The third proof involves demonstrating that if there exist integers r and s such that rx + sy = 1, then gcd(x, y) = 1, with suggestions for a structured approach to the proof.

PREREQUISITES
  • Understanding of greatest common divisors (gcd)
  • Familiarity with basic number theory concepts
  • Knowledge of divisibility and integer properties
  • Experience with proof techniques, particularly contradiction
NEXT STEPS
  • Study the properties of gcd and their proofs in number theory
  • Learn about the Euclidean algorithm for computing gcd
  • Explore the concept of linear combinations in relation to gcd
  • Investigate the implications of Bézout's identity in number theory
USEFUL FOR

Students of number theory, mathematicians interested in proofs involving gcd, and anyone looking to strengthen their understanding of divisibility and integer properties.

Andrusko
Hey all, I'm an absolute noob to number theory stuff and I've got this assignment to do with a few proofs.

Homework Statement



Prove that:

i) if gcd(a,b) = c then gcd(a,a+b) = c

ii) if gcd(a,b) = c and a = a'c and b = b'c then gcd(a',b') = 1

iii) if there exist integers r, s such that rx + sy = 1 then gcd(x,y) = 1

Homework Equations




The Attempt at a Solution



i) pretty sure this is right, I used a contradiction thing at the end:

prove that if gcd(a,b) = c then gcd(a,a+b) = c

c|a and c|b
a = a'c and b = b'c
so a+b = a'c + b'c
= (a' + b')c
implies c|(a+b)

suppose gcd(a,a+b) = d

we know c|a and c|(a+b), so c is a common divisor of a and a+b, hence c <= d

also d|a and d|(a+b), so d|((a+b) - a), i.e. d|b

so d is a common divisor of a and b, and since c = gcd(a,b), d <= c

so d = c
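Not part of the proof, but a quick numerical sanity check of part (i) using Python's built-in math.gcd (over small positive integers only; this is illustration, not a substitute for the argument):

```python
from math import gcd

# Part (i): gcd(a, a+b) should always equal gcd(a, b).
for a in range(1, 50):
    for b in range(1, 50):
        assert gcd(a, a + b) == gcd(a, b)

# Concrete instance: gcd(12, 18) = 6 and gcd(12, 30) = 6.
print(gcd(12, 18), gcd(12, 12 + 18))
```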

------------------------------

ii) this one's a bit shaky:

prove that if gcd(a,b) = c and a = a'c and b = b'c then gcd(a',b') = 1

now c = ra + sb for some integers r, s (by Bezout's identity)
so c = r(a'c) + s(b'c)
= ra'c + sb'c
= (ra' + sb')c

dividing through by c gives:

1 = ra' + sb'

and doesn't this imply that gcd(a',b') = 1? (any common divisor of a' and b' would divide ra' + sb' = 1)

I don't think this is the right way to prove it.
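For what it's worth, part (ii) also checks out numerically. A small Python sketch (again just an illustration over small positive integers, using math.gcd):

```python
from math import gcd

# Part (ii): dividing both numbers by their gcd leaves coprime cofactors,
# i.e. gcd(a/c, b/c) = 1 where c = gcd(a, b).
for a in range(1, 50):
    for b in range(1, 50):
        c = gcd(a, b)
        assert gcd(a // c, b // c) == 1

# Concrete instance: a = 12, b = 18, c = 6 gives gcd(2, 3) = 1.
print(gcd(12 // 6, 18 // 6))
```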

iii) No idea where to start with this. I guess I would use a similar method to the above, but I doubt its validity.

Thanks for any help!
 
The second one is very intuitive: if some number d > 1 divided both a' and b', then dc would divide both a and b, so c could not be the greatest common divisor. Perhaps you can use this in your proof (suppose gcd(a', b') = d with d not equal to 1, and derive that c is not the gcd).

For iii) I suggest writing d = gcd(x, y), x = x' d, y = y' d and showing that d must be 1.
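As a side note on (iii): the extended Euclidean algorithm is what actually produces such r and s. The helper below is my own illustration (not from the thread); it returns (g, r, s) with r*x + s*y = g = gcd(x, y), so whenever r*x + s*y = 1, any common divisor of x and y divides 1:

```python
def ext_gcd(x, y):
    """Extended Euclidean algorithm.

    Returns (g, r, s) with r*x + s*y == g == gcd(x, y).
    """
    if y == 0:
        return (x, 1, 0)
    g, r, s = ext_gcd(y, x % y)
    # Back-substitute: g = r*y + s*(x - (x//y)*y) = s*x + (r - (x//y)*s)*y
    return (g, s, r - (x // y) * s)

# Example: x = 7, y = 5 are coprime, so we get a combination equal to 1.
g, r, s = ext_gcd(7, 5)
print(g, r, s, r * 7 + s * 5)  # prints: 1 -2 3 1
```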
 
