Can anyone please review/verify my proofs for this gcd problem?

Summary:
The discussion centers on verifying proofs related to the greatest common divisor (gcd) of various expressions given that gcd(a, b) = 1. The proofs demonstrate that gcd(a+b, a-b) can equal 1 or 2, gcd(2a+b, a+2b) can equal 1 or 3, gcd(a+b, a²+b²) can equal 1 or 2, and gcd(a+b, a²-ab+b²) can equal 1 or 3. Each proof utilizes properties of gcd and divisibility to derive the results, with suggestions for clarifying certain steps and ensuring logical consistency. Overall, the revisions aim to enhance clarity and correctness in the proofs presented.
Math100
Homework Statement
Assuming that ##\gcd(a, b)=1##, prove the following:
(a) ##\gcd(a+b, a-b)=1## or ##2##.
[Hint: Let ##d=\gcd(a+b, a-b)## and show that ##d \mid 2a## and ##d \mid 2b##, and thus that ##d \le \gcd(2a, 2b) = 2\gcd(a, b)##.]
(b) ##\gcd(2a+b, a+2b)=1## or ##3##.
(c) ##\gcd(a+b, a^2+b^2)=1## or ##2##.
[Hint: ##a^2+b^2=(a+b)(a-b)+2b^2##.]
(d) ##\gcd(a+b, a^2-ab+b^2)=1## or ##3##.
[Hint: ##a^2-ab+b^2=(a+b)^2-3ab##.]
Relevant Equations
None.
Proof: (a) Suppose that ##\gcd(a, b)=1##.
Let ##d=\gcd(a+b, a-b)##.
By definition of the greatest common divisor,
we have ##d \mid (a+b)## and ##d \mid (a-b)##.
This means ##d \mid [(a+b)+(a-b)]## and ##d \mid [(a+b)-(a-b)]##,
so ##d \mid 2a## and ##d \mid 2b##.
Note that ##d \le \gcd(2a, 2b) = 2\gcd(a, b)##.
Since ##\gcd(a, b)=1##, it follows that ##2\gcd(a, b)=2(1)=2##.
Thus ##d \mid 2##, which implies that ##d=1## or ##d=2##.
Therefore, ##\gcd(a+b, a-b)=1## or ##2##.
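(Both values occur: for example, ##a=2, b=1## gives ##\gcd(3, 1)=1##, while ##a=3, b=1## gives ##\gcd(4, 2)=2##.)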
(b) Suppose that ##\gcd(a, b)=1##.
Let ##d=\gcd(2a+b, a+2b)##.
By definition of the greatest common divisor,
we have ##d \mid (2a+b)## and ##d \mid (a+2b)##.
This means ##d \mid [2(2a+b)-(a+2b)]## and ##d \mid [-(2a+b)+2(a+2b)]##,
so ##d \mid 3a## and ##d \mid 3b##.
Note that ##d \le \gcd(3a, 3b) = 3\gcd(a, b)##.
Since ##\gcd(a, b)=1##, it follows that ##3\gcd(a, b)=3(1)=3##.
Thus ##d \mid 3##, which implies that ##d=1## or ##d=3##.
Therefore, ##\gcd(2a+b, a+2b)=1## or ##3##.
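(Both values occur: ##a=1, b=2## gives ##\gcd(4, 5)=1##, while ##a=1, b=1## gives ##\gcd(3, 3)=3##.)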
(c) Suppose that ##\gcd(a, b)=1##.
Let ##d=\gcd(a+b, a^2+b^2)##.
By definition of the greatest common divisor,
we have ##d \mid (a+b)## and ##d \mid (a^2+b^2)##.
This means ##d \mid [(a^2+b^2)+(a^2-b^2)]## and ##d \mid [(a^2+b^2)-(a^2-b^2)]##,
so ##d \mid 2a^2## and ##d \mid 2b^2##.
Note that ##d \le \gcd(2a^2, 2b^2) = 2\gcd(a^2, b^2)##.
Since ##\gcd(a, b)=1##, it follows that ##\gcd(a^2, b^2)=1##.
Thus ##2\gcd(a^2, b^2)=2(1)=2##, and so ##d \mid 2##,
which implies that ##d=1## or ##d=2##.
Therefore, ##\gcd(a+b, a^2+b^2)=1## or ##2##.
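(Both values occur: ##a=1, b=2## gives ##\gcd(3, 5)=1##, while ##a=1, b=3## gives ##\gcd(4, 10)=2##.)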
(d) Suppose that ##\gcd(a, b)=1##.
Let ##d=\gcd(a+b, a^2-ab+b^2)##.
By definition of the greatest common divisor,
we have ##d \mid (a+b)## and ##d \mid (a^2-ab+b^2)##.
This means ##d \mid [(a+b)^2-(a^2-ab+b^2)]##, so ##d \mid 3ab##.
Note that each prime divisor ##k## of ##d## must divide either ##3##, ##a##, or ##b##.
Thus, we have ##d \mid [3a(a+b)-3ab]## and ##d \mid [3b(a+b)-3ab]##,
so ##d \mid 3a^2## and ##d \mid 3b^2##.
Then we get ##d \le \gcd(3a^2, 3b^2) = 3\gcd(a^2, b^2)##.
Since ##\gcd(a, b)=1##, it follows that ##\gcd(a^2, b^2)=1##.
Thus ##3\gcd(a^2, b^2)=3(1)=3##, and so ##d \mid 3##,
which implies that ##d=1## or ##d=3##.
Therefore, ##\gcd(a+b, a^2-ab+b^2)=1## or ##3##.
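(Both values occur: ##a=1, b=3## gives ##\gcd(4, 7)=1##, while ##a=1, b=2## gives ##\gcd(3, 3)=3##.)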
 
Math100 said:
Note that ##d \le \gcd(3a, 3b) = 3\gcd(a, b)##.
Since ##\gcd(a, b)=1##, it follows that ##3\gcd(a, b)=3(1)=3##.
Thus ##d \mid 3##, which implies that ##d=1## or ##d=3##.
b) From the first and second lines, we can conclude ##d \le 3##. But that doesn't imply ##d## divides ##3## (for instance, ##2 \le 3## yet ##2 \nmid 3##). This can be fixed by changing the first line to "Note that ##d## divides ##\gcd(3a,3b) = 3\gcd(a, b)##", which holds because ##d## is a common divisor of ##3a## and ##3b##. You may want to make a similar change in a), c), and d), even though it happens to work out in the a) and c) cases.

Math100 said:
By definition of the greatest common divisor,
we have ##d \mid (a+b)## and ##d \mid (a^2+b^2)##.
This means ##d \mid [(a^2+b^2)+(a^2-b^2)]## and ##d \mid [(a^2+b^2)-(a^2-b^2)]##
c) Maybe I'm just being slow, but I think it would be helpful to include a line "Since ##d \vert (a+b)##, we have ##d \vert (a+b)(a-b) = a^2 - b^2##." Other than that, I think everything else is correct.
Math100 said:
Note that each prime divisor ##k## of ##d## must divide either ##3##, ##a##, or ##b##.
Thus, we have ##d \mid [3a(a+b)-3ab]## and ##d \mid [3b(a+b)-3ab]##
d) I don't think the first line implies the second; i.e., if a prime ##p## divides ##b## but not ##a## or ##3##, then ##p## doesn't divide ##3a(a+b) - 3ab = 3a^2##. However, the second line is true since ##d \vert (a+b)## and ##d \vert 3ab##. Other than that, and changing ##d \le 3## to ##d \vert 3##, it looks good to me.
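(Explicitly: ##d \mid (a+b)## gives ##d \mid 3a(a+b)##, and together with ##d \mid 3ab## this yields ##d \mid 3a(a+b) - 3ab = 3a^2##; the same argument with ##3b(a+b)## gives ##d \mid 3b^2##.)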
 
I revised this problem; can you please review/verify to see if it's correct this time?

Proof:
(a) Suppose that ##\gcd(a, b)=1##.
Let ##d=\gcd(a+b, a-b)##.
By definition of the greatest common divisor,
we have ##d \mid (a+b)## and ##d \mid (a-b)##.
This means ##d \mid [(a+b)+(a-b)]## and ##d \mid [(a+b)-(a-b)]##,
so ##d \mid 2a## and ##d \mid 2b##.
Note that ##d## divides ##\gcd(2a, 2b) = 2\gcd(a, b)##.
Since ##\gcd(a, b)=1##, it follows that ##2\gcd(a, b)=2(1)=2##.
Thus ##d \mid 2##, which implies that ##d=1## or ##d=2##.
Therefore, ##\gcd(a+b, a-b)=1## or ##2##.
(b) Suppose that ##\gcd(a, b)=1##.
Let ##d=\gcd(2a+b, a+2b)##.
By definition of the greatest common divisor,
we have ##d \mid (2a+b)## and ##d \mid (a+2b)##.
This means ##d \mid [2(2a+b)-(a+2b)]## and ##d \mid [-(2a+b)+2(a+2b)]##,
so ##d \mid 3a## and ##d \mid 3b##.
Note that ##d## divides ##\gcd(3a, 3b) = 3\gcd(a, b)##.
Since ##\gcd(a, b)=1##, it follows that ##3\gcd(a, b)=3(1)=3##.
Thus ##d \mid 3##, which implies that ##d=1## or ##d=3##.
Therefore, ##\gcd(2a+b, a+2b)=1## or ##3##.
(c) Suppose that ##\gcd(a, b)=1##.
Let ##d=\gcd(a+b, a^2+b^2)##.
By definition of the greatest common divisor,
we have ##d \mid (a+b)## and ##d \mid (a^2+b^2)##.
Since ##d \mid (a+b)##, we have ##d \mid (a+b)(a-b) = a^2-b^2##.
This means ##d \mid [(a^2+b^2)+(a^2-b^2)]## and ##d \mid [(a^2+b^2)-(a^2-b^2)]##,
so ##d \mid 2a^2## and ##d \mid 2b^2##.
Note that ##d## divides ##\gcd(2a^2, 2b^2) = 2\gcd(a^2, b^2)##.
Since ##\gcd(a, b)=1##, it follows that ##\gcd(a^2, b^2)=1##.
Thus ##2\gcd(a^2, b^2)=2(1)=2##, and so ##d \mid 2##,
which implies that ##d=1## or ##d=2##.
Therefore, ##\gcd(a+b, a^2+b^2)=1## or ##2##.
(d) Suppose that ##\gcd(a, b)=1##.
Let ##d=\gcd(a+b, a^2-ab+b^2)##.
By definition of the greatest common divisor,
we have ##d \mid (a+b)## and ##d \mid (a^2-ab+b^2)##.
This means ##d \mid [(a+b)^2-(a^2-ab+b^2)]##,
so ##d \mid 3ab##.
Now we have ##d \mid [3a(a+b)-3ab]## and ##d \mid [3b(a+b)-3ab]##,
so ##d \mid 3a^2## and ##d \mid 3b^2##.
Note that ##d## divides ##\gcd(3a^2, 3b^2) = 3\gcd(a^2, b^2)##.
Since ##\gcd(a, b)=1##, it follows that ##\gcd(a^2, b^2)=1##.
Thus ##3\gcd(a^2, b^2)=3(1)=3##, and so ##d \mid 3##,
which implies that ##d=1## or ##d=3##.
Therefore, ##\gcd(a+b, a^2-ab+b^2)=1## or ##3##.
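As a supplementary sanity check (not a substitute for the proofs), here is a minimal brute-force sketch in Python that tests all four claims over small coprime pairs:
```python
from math import gcd

# Brute-force check of all four claims for coprime pairs 1 <= a, b < 50.
for a in range(1, 50):
    for b in range(1, 50):
        if gcd(a, b) != 1:
            continue  # the hypothesis is gcd(a, b) = 1
        assert gcd(a + b, abs(a - b)) in (1, 2)          # (a)
        assert gcd(2*a + b, a + 2*b) in (1, 3)           # (b)
        assert gcd(a + b, a*a + b*b) in (1, 2)           # (c)
        assert gcd(a + b, a*a - a*b + b*b) in (1, 3)     # (d)
print("All four claims hold on the tested range.")
```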

I am so sorry for the late response on this problem; I was busy at my workplace for the past two days. The problem is now revised. Please leave any comments or feedback; I want to know whether it's good or not.
 
Math100 said:
I revised this problem; can you please review/verify to see if it's correct this time? [...] I am so sorry for the late response on this problem; I was busy at my workplace for the past two days.
No need to apologize; the proofs look correct to me.
 
Thank you so much for the help!
 