Why Does gcd(a+b, a-b) Only Equal 1 or 2?

PsychonautQQ

Homework Statement


Show that gcd(a+b, a-b) is either 1 or 2. (Hint: show that d | 2a and d | 2b, where d = gcd(a+b, a-b).)

Homework Equations


d = gcd(a+b, a-b) = x(a+b) + y(a-b) for some integers x and y (Bézout's identity)

The Attempt at a Solution


So, by the definition of divisibility, with d = gcd(a+b, a-b) there are integers r and s with:
a + b = dr
a - b = ds

Adding and subtracting these equations gives what the hint asks us to conclude:
r + s = 2a/d, i.e. d(r + s) = 2a, so d | 2a
r - s = 2b/d, i.e. d(r - s) = 2b, so d | 2b

I'm trying to figure out what to do from here; I'm having a hard time using the hint to restrict d to 1 or 2.
 

You must have some condition on a and b. Like, are they relatively prime?
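Without such a condition the statement can fail: for example, ##a = 6,\ b = 3## gives ##\gcd(a+b, a-b) = \gcd(9, 3) = 3##.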
 
Hey there,

I think a really important concept you might be forgetting is linearity of divisibility: since your gcd divides both a+b and a-b, it also divides ANY linear combination of those two integers (with integer weights, of course).

Namely, rather than keeping things completely general, notice that d | x(a+b) + y(a-b) holds for any particular integer values of x and y you choose, so you can pick values that put the right-hand side into a more useful form. I also think your book intends a and b to be coprime.

So, why not try playing around and actually picking numerical values for x and y that can help eliminate either a or b?
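Just as a sketch of how that could go (assuming, as above, that ##\gcd(a,b) = 1##): choosing ##x = y = 1## gives ##d \mid (a+b) + (a-b) = 2a##, and choosing ##x = 1,\ y = -1## gives ##d \mid (a+b) - (a-b) = 2b##. Then ##d \mid \gcd(2a, 2b) = 2\gcd(a,b) = 2##, so ##d## can only be 1 or 2.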
 
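If you want a quick numerical sanity check as well, a brute-force loop along these lines works (just a sketch; the loop bounds and variable names are arbitrary):

from math import gcd

# Brute-force check: for coprime a, b, gcd(a+b, a-b) should always be 1 or 2.
for a in range(1, 100):
    for b in range(1, 100):
        if gcd(a, b) == 1:
            d = gcd(a + b, abs(a - b))  # abs() just keeps the second argument non-negative
            assert d in (1, 2), (a, b, d)
print("gcd(a+b, a-b) was 1 or 2 for every coprime pair checked")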