Proving gcd(a+b,ab) = 1 for Relatively Prime Numbers

PsychonautQQ

Homework Statement


Let ## a ## and ## b ## be relatively prime. Show that ## \gcd(a+b, ab) = 1 ##.

Homework Equations


## ax + by = 1 ## for some integers ## x ## and ## y ## (Bézout's identity; such ## x ## and ## y ## exist because ## \gcd(a,b) = 1 ##).
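For a concrete instance, take ## a = 3 ##, ## b = 5 ##: then ## x = 2 ##, ## y = -1 ## works, since ## 3 \cdot 2 + 5 \cdot (-1) = 1 ##.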

The Attempt at a Solution


I set ## \gcd(a+b, ab) = d ##. I'm now trying to show that ## d = 1 ## using elementary algebra techniques.
## a+b = rd ##
## ab = sd ##
## ax + by = 1 ##

I'm kind of stuck here... am I on the right track? Do I just need to aggressively rearrange things until I can express ## a+b ## and ## ab ## in a linear combination that equals one? Or perhaps arrange them in such a way that I show ## d ## divides both ## a ## and ## b ## individually, so that it must be 1 since they are relatively prime? Any hints?

Update:

## a+b = rd \implies b = rd - a ##

## ab = a(rd-a) = ard - a^2 = sd ##

Dividing both sides by ## d ##:

## ar - \frac{a^2}{d} = s ##

## ar ## is an integer and ## s ## is an integer, so ## a^2/d ## must be an integer; therefore ## d \mid a^2 ##.

I employed a similar argument to show that ## d \mid b^2 ##. Since ## \gcd(a,b) = 1 ## gives ## \gcd(a^2, b^2) = 1 ##, and ## d ## divides both ## a^2 ## and ## b^2 ##, ## d ## must divide ## \gcd(a^2, b^2) = 1 ##, so ## d = 1 ##.
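As a quick numerical sanity check (not a proof, just to make sure I didn't slip up somewhere), here is a short Python loop over small coprime pairs; the bound 50 is arbitrary:

Python:
from math import gcd

# For every pair (a, b) with 1 <= a, b < 50: whenever gcd(a, b) == 1,
# the claim says gcd(a + b, a*b) should be 1 as well.
for a in range(1, 50):
    for b in range(1, 50):
        if gcd(a, b) == 1:
            assert gcd(a + b, a * b) == 1, (a, b)

print("gcd(a+b, ab) = 1 held for every coprime pair with a, b < 50")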

Is this a solid argument?
 
Your argument looks perfectly sound to me. You can make it a little more elegant by going: ## ard - a^2 = sd \implies a^2 = (ar-s)d \implies d \mid a^2 ##, but it's really the same thing.
Note, however, that this is not the method suggested in the formulation of the problem: you didn't make use of ## ax+by=1 ##. If you want to try that as well, do you know where that equation comes from?
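In case you want to check your attempt against it later, here is one way that route can go (try it on your own first!). Squaring ## ax+by=1 ## and regrouping:

$$1 = (ax+by)^2 = a^2x^2 + 2abxy + b^2y^2 = (a+b)(ax^2+by^2) - ab(x-y)^2$$

This writes 1 as an integer combination of ## a+b ## and ## ab ##, so any common divisor of the two must divide 1.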
 
Oh wow, didn't even notice that I didn't use that equation, haha. Yeah, I know where it comes from. Thanks for the post yo!
 
PsychonautQQ said:
Oh wow, didn't even notice that I didn't use that equation, haha. Yeah, I know where it comes from. Thanks for the post yo!
No problem. I can see you don't like going by the book; you tackle the problem your own way, which is cool :)
 
Thanks ^_^ I appreciate it. I'm pretty new to this math thing, but I want to learn its mysteries, so I appreciate the compliment :D
 