Can anyone please review/verify my proofs for this gcd problem?

  • Thread starter: Math100
  • Tags: Gcd, Proofs
SUMMARY

The discussion revolves around proving properties of the greatest common divisor (gcd) under the condition that gcd(a, b) = 1. The proofs demonstrate that gcd(a+b, a-b) = 1 or 2, gcd(2a+b, a+2b) = 1 or 3, gcd(a+b, a²+b²) = 1 or 2, and gcd(a+b, a²-ab+b²) = 1 or 3. Each proof employs the definition of gcd and properties of divisibility to establish the results definitively.

PREREQUISITES
  • Understanding of gcd and its properties
  • Familiarity with basic algebraic manipulation
  • Knowledge of divisibility rules
  • Experience with mathematical proofs
NEXT STEPS
  • Study the Euclidean algorithm for calculating gcd (a minimal sketch follows this list)
  • Explore properties of linear combinations in number theory
  • Learn about modular arithmetic and its applications
  • Investigate advanced gcd properties in polynomial rings
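For the first item above, here is a minimal Python sketch of the Euclidean algorithm (the name euclid_gcd is illustrative, not from the thread):

```python
def euclid_gcd(a: int, b: int) -> int:
    """Euclidean algorithm: gcd(a, b) = gcd(b, a mod b), iterated until b = 0."""
    while b:
        a, b = b, a % b
    return abs(a)

assert euclid_gcd(252, 198) == 18  # 252 = 198 + 54, 198 = 3*54 + 36, 54 = 36 + 18, 36 = 2*18
assert euclid_gcd(7, 5) == 1
```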
USEFUL FOR

Mathematicians, students studying number theory, educators teaching algebraic concepts, and anyone interested in the properties of gcd and its applications in proofs.

Math100
Homework Statement
Assuming that gcd(a, b)=1, prove the following:
(a) gcd(a+b, a-b)=1 or 2.
[Hint: Let d=gcd(a+b, a-b) and show that d##\mid## 2a, d##\mid##2b, and thus that d##\leq##gcd(2a, 2b)=2 gcd(a, b).]
(b) gcd(2a+b, a+2b)=1 or 3.
(c) gcd(a+b, ##a^{2}+b^{2}##)=1 or 2.
[Hint: ##a^{2}+b^{2}=(a+b)(a-b)+2b^{2}##.]
(d) gcd(a+b, ##a^{2}-ab+b^{2}##)=1 or 3.
[Hint: ##a^{2}-ab+b^{2}=(a+b)^{2}-3ab##.]
Relevant Equations
None.
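Before the proofs, the two hint identities can be sanity-checked symbolically; a minimal sketch, assuming sympy is available:

```python
from sympy import expand, symbols

a, b = symbols('a b')

# Hint for (c): a^2 + b^2 = (a+b)(a-b) + 2b^2
assert expand((a + b)*(a - b) + 2*b**2 - (a**2 + b**2)) == 0
# Hint for (d): a^2 - ab + b^2 = (a+b)^2 - 3ab
assert expand((a + b)**2 - 3*a*b - (a**2 - a*b + b**2)) == 0
```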
Proof: (a) Suppose that gcd(a, b)=1.
Let d=gcd(a+b, a-b).
By definition of the greatest common divisor,
we have that d##\mid##(a+b) and d##\mid##(a-b).
This means d##\mid##[(a+b)+(a-b)] and d##\mid##[(a+b)-(a-b)],
so we have d##\mid##2a and d##\mid##2b.
Note that d##\leq##gcd(2a, 2b)=2 gcd(a, b).
Since gcd(a, b)=1, it follows that 2 gcd(a, b)=2(1)=2.
Thus, d##\mid##2, which implies that d=1 or d=2.
Therefore, gcd(a+b, a-b)=1 or 2.
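A quick brute-force check of the claim in (a), purely as an empirical sanity test (not part of the proof):

```python
from math import gcd

# For every coprime pair with 1 <= b < a < 200,
# gcd(a+b, a-b) should come out as 1 or 2.
for a in range(2, 200):
    for b in range(1, a):
        if gcd(a, b) == 1:
            assert gcd(a + b, a - b) in (1, 2)
```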
(b) Suppose that gcd(a, b)=1.
Let d=gcd(2a+b, a+2b).
By definition of the greatest common divisor,
we have that d##\mid##(2a+b) and d##\mid##(a+2b).
This means d##\mid##[2(2a+b)-(a+2b)] and d##\mid##[-(2a+b)+2(a+2b)],
so we have d##\mid##3a and d##\mid##3b.
Note that d##\leq##gcd(3a, 3b)=3 gcd(a, b).
Since gcd(a, b)=1, it follows that 3 gcd(a, b)=3(1)=3.
Thus, d##\mid##3, which implies that d=1 or d=3.
Therefore, gcd(2a+b, a+2b)=1 or 3.
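The same kind of empirical check for (b):

```python
from math import gcd

# For every coprime pair with 1 <= a, b < 200,
# gcd(2a+b, a+2b) should come out as 1 or 3.
for a in range(1, 200):
    for b in range(1, 200):
        if gcd(a, b) == 1:
            assert gcd(2*a + b, a + 2*b) in (1, 3)
```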
(c) Suppose that gcd(a, b)=1.
Let d=gcd(a+b, ##a^{2}+b^{2}##).
By definition of the greatest common divisor,
we have that d##\mid##(a+b) and ##d\mid(a^{2}+b^{2})##.
This means ##d\mid[(a^{2}+b^{2})+(a^{2}-b^{2})]## and ##d\mid[(a^{2}+b^{2})-(a^{2}-b^{2})]##,
so we have ##d\mid 2a^{2}## and ##d\mid 2b^{2}##.
Note that ##d\leq\gcd(2a^{2}, 2b^{2})=2\gcd(a^{2}, b^{2})##.
Since gcd(a, b)=1, it follows that ##\gcd(a^{2}, b^{2})=1##.
Thus, ##2\gcd(a^{2}, b^{2})=2(1)=2##, and so d##\mid##2,
which implies that d=1 or d=2.
Therefore, gcd(a+b, ##a^{2}+b^{2}##)=1 or 2.
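And an empirical check for (c):

```python
from math import gcd

# For every coprime pair with 1 <= a, b < 200,
# gcd(a+b, a^2+b^2) should come out as 1 or 2.
for a in range(1, 200):
    for b in range(1, 200):
        if gcd(a, b) == 1:
            assert gcd(a + b, a*a + b*b) in (1, 2)
```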
(d) Suppose that gcd(a, b)=1.
Let d=gcd(a+b, ##a^{2}-ab+b^{2}##).
By definition of the greatest common divisor,
we have that d##\mid##(a+b) and ##d\mid(a^{2}-ab+b^{2})##.
This means ##d\mid[(a+b)^{2}-(a^{2}-ab+b^{2})]##, so we have d##\mid##3ab.
Note that each prime divisor k of d must divide either 3, a, or b.
Thus, we have that d##\mid##[3a(a+b)-3ab] and d##\mid##[3b(a+b)-3ab],
so ##d\mid 3a^{2}## and ##d\mid 3b^{2}##.
Then we get ##d\leq\gcd(3a^{2}, 3b^{2})=3\gcd(a^{2}, b^{2})##.
Since gcd(a, b)=1, it follows that ##\gcd(a^{2}, b^{2})=1##.
Thus, ##3\gcd(a^{2}, b^{2})=3(1)=3##, and so d##\mid##3,
which implies that d=1 or d=3.
Therefore, gcd(a+b, ##a^{2}-ab+b^{2}##)=1 or 3.
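And for (d):

```python
from math import gcd

# For every coprime pair with 1 <= a, b < 200,
# gcd(a+b, a^2-ab+b^2) should come out as 1 or 3.
for a in range(1, 200):
    for b in range(1, 200):
        if gcd(a, b) == 1:
            assert gcd(a + b, a*a - a*b + b*b) in (1, 3)
```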
 
Math100 said:
Note that d≤gcd(3a, 3b)=3 gcd(a, b).
Since gcd(a, b)=1, it follows that 3 gcd(a, b)=3(1)=3.
Thus, d∣3, which implies that d=1 or d=3.
b) From the first and second lines, we can conclude ##d \le 3##, but that doesn't imply ##d## divides ##3##. This can be fixed by changing the first line to "Note that ##d## divides ##\gcd(3a,3b) = 3\gcd(a, b)##." You may want to make a similar change in a), c), and d), even though it happens to work out in the a) and c) cases.

Math100 said:
By definition of the greatest common divisor,
we have that d∣(a+b) and ##d\mid(a^{2}+b^{2})##.
This means ##d\mid[(a^{2}+b^{2})+(a^{2}-b^{2})]## and ##d\mid[(a^{2}+b^{2})-(a^{2}-b^{2})]##
c) Maybe I'm just being slow, but I think it would be helpful to include a line "Since ##d \vert (a+b)##, we have ##d \vert (a+b)(a-b) = a^2 - b^2##." Other than that, I think everything else is correct.
Math100 said:
Note that each prime divisor k of d must divide either 3, a or b.
Thus, we have that d∣[3a(a+b)-3ab] and d∣[3b(a+b)-3ab]
d) I don't think the first line implies the second, i.e., if a prime ##p## divides ##b## but not ##a## or ##3##, then ##p## does not divide ##3a(a+b) - 3ab##. However, the second line is true since ##d \vert (a+b)## and ##d \vert 3ab##. Other than that, and changing ##d \le 3## to ##d \vert 3##, it looks good to me.
 
I revised this problem; can you please review/verify whether it's correct this time?

Proof:
(a) Suppose that gcd(a, b)=1.
Let d=gcd(a+b, a-b).
By definition of the greatest common divisor,
we have that d##\mid##(a+b) and d##\mid##(a-b).
This means d##\mid##[(a+b)+(a-b)] and d##\mid##[(a+b)-(a-b)],
so we have d##\mid##2a and d##\mid##2b.
Note that d divides gcd(2a, 2b)=2 gcd(a, b).
Since gcd(a, b)=1, it follows that 2 gcd(a, b)=2(1)=2.
Thus, d##\mid##2, which implies that d=1 or d=2.
Therefore, gcd(a+b, a-b)=1 or 2.
(b) Suppose that gcd(a, b)=1.
Let d=gcd(2a+b, a+2b).
By definition of the greatest common divisor,
we have that d##\mid##(2a+b) and d##\mid##(a+2b).
This means d##\mid##[2(2a+b)-(a+2b)] and d##\mid##[-(2a+b)+2(a+2b)],
so we have d##\mid##3a and d##\mid##3b.
Note that d divides gcd(3a, 3b)=3 gcd(a, b).
Since gcd(a, b)=1, it follows that 3 gcd(a, b)=3(1)=3.
Thus, d##\mid##3, which implies that d=1 or d=3.
Therefore, gcd(2a+b, a+2b)=1 or 3.
(c) Suppose that gcd(a, b)=1.
Let d=gcd(a+b, ##a^{2}+b^{2}##).
By definition of the greatest common divisor,
we have that d##\mid##(a+b) and ##d\mid(a^{2}+b^{2})##.
Since d##\mid##(a+b), we have ##d\mid(a+b)(a-b)=a^{2}-b^{2}##.
This means ##d\mid[(a^{2}+b^{2})+(a^{2}-b^{2})]## and ##d\mid[(a^{2}+b^{2})-(a^{2}-b^{2})]##,
so we have ##d\mid 2a^{2}## and ##d\mid 2b^{2}##.
Note that d divides ##\gcd(2a^{2}, 2b^{2})=2\gcd(a^{2}, b^{2})##.
Since gcd(a, b)=1, it follows that ##\gcd(a^{2}, b^{2})=1##.
Thus, ##2\gcd(a^{2}, b^{2})=2(1)=2##, and so d##\mid##2,
which implies that d=1 or d=2.
Therefore, gcd(a+b, ##a^{2}+b^{2}##)=1 or 2.
(d) Suppose that gcd(a, b)=1.
Let d=gcd(a+b, ##a^{2}-ab+b^{2}##).
By definition of the greatest common divisor,
we have that d##\mid##(a+b) and ##d\mid(a^{2}-ab+b^{2})##.
This means ##d\mid[(a+b)^{2}-(a^{2}-ab+b^{2})]##,
so we have d##\mid##3ab.
Now we have that d##\mid##[3a(a+b)-3ab] and d##\mid##[3b(a+b)-3ab],
so ##d\mid 3a^{2}## and ##d\mid 3b^{2}##.
Note that d divides ##\gcd(3a^{2}, 3b^{2})=3\gcd(a^{2}, b^{2})##.
Since gcd(a, b)=1, it follows that ##\gcd(a^{2}, b^{2})=1##.
Thus, ##3\gcd(a^{2}, b^{2})=3(1)=3##, and so d##\mid##3,
which implies that d=1 or d=3.
Therefore, gcd(a+b, ##a^{2}-ab+b^{2}##)=1 or 3.
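To see that both values in each conclusion really occur, here is a small search over coprime pairs (the helper first_witness is hypothetical, written only for this check):

```python
from math import gcd

def first_witness(f, target, limit=50):
    """Return the first coprime pair (a, b) with f(a, b) == target, or None."""
    for a in range(1, limit):
        for b in range(1, limit):
            if gcd(a, b) == 1 and f(a, b) == target:
                return (a, b)
    return None

print(first_witness(lambda a, b: gcd(a + b, a - b), 2))            # (1, 1): gcd(2, 0) = 2
print(first_witness(lambda a, b: gcd(2*a + b, a + 2*b), 3))        # (1, 1): gcd(3, 3) = 3
print(first_witness(lambda a, b: gcd(a + b, a*a + b*b), 2))        # (1, 1): gcd(2, 2) = 2
print(first_witness(lambda a, b: gcd(a + b, a*a - a*b + b*b), 3))  # (1, 2): gcd(3, 3) = 3
```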

I am so sorry for the late response to this problem. I was too busy at my workplace for the past two days. The problem is now revised. Please leave any comments or feedback; I want to know whether it's good or not.
 
Math100 said:
I revised this problem; can you please review/verify whether it's correct this time? …

I am so sorry for the late response to this problem. I was too busy at my workplace for the past two days. …
No need to apologize; the proofs look correct to me.
 
Thank you so much for the help!
 
