Prove that if a² + ab + b² = 0 then a = 0 and b = 0

  • Thread starter: CynicusRex
  • Tags: Algebra, Proof
Summary
The discussion centers on proving that if a² + ab + b² = 0, then both a and b must equal zero. The initial attempts at proof involve manipulating the equation but ultimately fail to reach a conclusive argument. A more effective approach suggests considering the non-negativity of a² and b², leading to the conclusion that if either a or b is non-zero, the equation cannot hold. The conversation also touches on the importance of clarity in proofs and the need for a structured approach to avoid logical missteps. Overall, the key takeaway is that the only solution to the equation is a = 0 and b = 0.
  • #31
Try a proof by contradiction. The original statement is this: If ##a^2 + ab + b^2 = 0##, then a = 0 and b = 0.
Suppose the hypothesis is true but the conclusion is false; i.e., suppose ##a^2 + ab + b^2 = 0## holds but it is not true that a = 0 and b = 0.
The negation of the conclusion is equivalent to ##a \ne 0 \text{ or } b \ne 0##, by De Morgan's law.
The assumption that ##a \ne 0## or ##b \ne 0## can be handled in four cases.
1. a < 0, b ≥ 0
##a^2 + ab + b^2 = 0 \Leftrightarrow a^2 + 2ab + b^2 = ab \Leftrightarrow (a + b)^2 = ab##
Show that this assumption leads to a contradiction.
2. a > 0, b ≥ 0
##a^2 + ab + b^2 = 0 \Leftrightarrow a^2 + b^2 = -ab##
Show that this assumption also leads to a contradiction.
Cases 3 and 4 are similar, but in those cases the assumptions concern the sign of ##b##.

By the way, in your work you seem to think that a and -a must necessarily be positive and negative, respectively. That's not true. If a = -3, then a < 0 while -a > 0.
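As a quick numerical sanity check of the claim being proved (an added sketch, not part of the original post), one can sample real pairs and confirm the quadratic form is strictly positive away from the origin; the grid and step size here are arbitrary choices:

```python
# Sanity check: a^2 + a*b + b^2 > 0 for every sampled real pair (a, b) != (0, 0).
# Illustrative only -- a finite grid is evidence, not a proof.
import itertools

vals = [x / 4 for x in range(-20, 21)]  # reals from -5.0 to 5.0 in steps of 0.25
for a, b in itertools.product(vals, repeat=2):
    if (a, b) != (0.0, 0.0):
        assert a * a + a * b + b * b > 0, (a, b)
print("a^2 + ab + b^2 > 0 for all sampled (a, b) != (0, 0)")
```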
 
  • Likes: QuantumQuest
  • #32
$$0=a^2 + ab + b^2$$

Complete the square for the first two terms on the right side.
 
  • Likes: TSny and PeroK
  • #33
You can use a proof by contrapositive. Your statement is of the form if ##P## then ##Q##. This is a conditional statement. The contrapositive is: if ##not~ Q## then ##not~ P##. Conditional statements are equivalent to their contrapositives.
##Q## is of the form (##A## and ##B##), where ##A## is ##a=0## and ##B## is ##b=0##. ##not~ Q## is ##not~(A## and ##B)##, which is (##not~ A## or ##not~ B##) by De Morgan's laws.
So ##not~ Q## is ##a\neq0## or ##b\neq0##.
NB: ##not~ A## is ##a\neq0##, since the negation of "##a## equals ##0##" is "##a## is not equal to ##0##". The same goes for ##not~ B## and ##not~ P##.

Proof:
You can rewrite the statement as: if ##a\neq0## or ##b\neq0## then ##a^2 +ab +b^2 \neq0##
Now ##a\neq0## or ##b\neq0## means at least one of them is non-zero. If exactly one of them is ##0## and the other is a non-zero number ##c##, you get
##c^2 + 0\cdot c + 0^2 = c^2,##
which is non-zero, as required.

Note, though, that this covers only the case where exactly one of ##a## and ##b## is zero; the case where both are non-zero still needs an argument (for example, the completed-square form in post #36 below), so this alone does not complete the proof.
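The De Morgan step above is easy to machine-check. A tiny truth-table sketch, added here for illustration and not part of the original post:

```python
# not (A and B) == (not A) or (not B): De Morgan's law over all truth values.
for A in (False, True):
    for B in (False, True):
        assert (not (A and B)) == ((not A) or (not B))
print("De Morgan's law holds for all four truth assignments")
```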
 
  • #34
I suppose it's a matter of taste, but I prefer to bound things over most 'contra' proofs.

Working in ##\mathbb R##, real numbers squared are always non-negative, and via ##GM \leq AM##, Cauchy's Inequality, or playing around with ##(\vert a \vert - \vert b \vert)^2 \geq 0##, we know ##\vert ab \vert \leq \frac{1}{2}(a^2 + b^2)##. Putting this together with the hypothesis, which gives ##\vert a^2 + b^2 \vert = \vert ab \vert##, we get the bound below

##a^2 + b^2 = \big \vert a^2\big \vert + \big \vert b^2 \big \vert = \big \vert a^2 + b^2 \big \vert = \big \vert ab \big \vert \leq \big \vert \frac{1}{2}\big(a^2 + b^2\big) \big \vert = \frac{1}{2}\big(\big \vert a^2\big \vert + \big \vert b^2 \big \vert\big)= \frac{1}{2}\big(a^2 + b^2\big)##

subtract ##\frac{1}{2}\big(a^2 + b^2\big)## from each side and get

##\frac{1}{2}\big(a^2 + b^2\big) \leq 0##

and if the sum of two real non-negative numbers is zero, then both must be zero. (A fancier way of saying this would refer to positive definiteness but it's not really needed.)
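The key bound here, ##\vert ab \vert \leq \frac{1}{2}(a^2+b^2)##, is just ##(\vert a \vert - \vert b \vert)^2 \geq 0## rearranged. A random-sampling check (an added sketch; the sample count and tolerance are arbitrary):

```python
# Check |a*b| <= (a^2 + b^2) / 2 on random real pairs;
# equivalent to (|a| - |b|)^2 >= 0.
import random

random.seed(0)
for _ in range(10_000):
    a = random.uniform(-10.0, 10.0)
    b = random.uniform(-10.0, 10.0)
    assert abs(a * b) <= (a * a + b * b) / 2 + 1e-12
print("|ab| <= (a^2 + b^2)/2 held for all samples")
```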
 
  • #35
A year ago I thought I'd never become good at skateboarding because I kept failing at one simple trick. After lots of practice, something clicked this month, and things are starting to feel natural. I really hope the same happens with proofs (math) as well, because I'm sure as hell feeling quite pathetic at the moment.
 
  • #36
George Jones said:
$$0=a^2 + ab + b^2$$

Complete the square for the first two terms on the right side.

Continuing on with this,

$$\begin{align}
0 &= a^2 + ab + b^2 \\
&= \left(a + \frac{b}{2} \right)^2 - \frac{b^2}{4} + b^2 \\
&= \left(a + \frac{b}{2} \right)^2 + \frac{3}{4} b^2 \\
& = x^2 + y^2
\end{align}$$

where ##x = a + \frac{b}{2}## and ##y = \frac{\sqrt{3}}{2} b##. A sum of squares of real numbers is zero only when each term is zero, so ##x = 0## and ##y = 0##, which gives ##b = 0## and then ##a = 0##.
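The completed-square identity can also be verified symbolically; a minimal sketch, assuming SymPy is available:

```python
# Symbolic check that a^2 + a*b + b^2 == (a + b/2)^2 + (3/4)*b^2 identically.
import sympy as sp

a, b = sp.symbols("a b", real=True)
lhs = a**2 + a * b + b**2
rhs = (a + b / 2) ** 2 + sp.Rational(3, 4) * b**2
assert sp.simplify(lhs - rhs) == 0  # difference simplifies to zero
print("identity verified:", sp.expand(rhs))
```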
 
  • Like
Likes jim mcnamara
  • #37
I've posted this on Reddit and apparently my proof is sloppy, but not wrong:

"I'm explaining it very broadly because it's been too long that I've been struggling with this. This problem can be found in the book Algebra - I.M. Gelfand, A. Shen. Problem 124. I'm currently self-studying math, and never had a formal training or experience with proofs. Also, we can use determinants, or the factorization of a³-b³ to prove the problem, but those I understand. I do not understand why the one below is wrong. Thanks in advance, any help is appreciated and probably much needed.

Prove that if a²+ab+b² = 0, then a = 0 and b = 0.

a²+ab+b² = 0
a²+b² = -ab

Let's assume a and b are not 0. On the left hand side a² and b² are always positive, so their sum is also always positive. On the right hand side there are four possible situations:
  • 1. a and b are both positive (they have the same sign)
  • 2. a and b are both negative (they have the same sign)
  • 3. a is negative, b is positive (opposite signs)
  • 4. a is positive, b is negative (opposite signs)
Situation 1 and 2: if a and b have the same sign, the right hand side is negative:
1. a and b are positive: a²+b² = -ab
2. a and b are negative: (-a)²+(-b)² = -(-a)(-b)
-> a²+b² = -ab

Since the left hand side must be positive, the right hand side must be positive as well for the equation to hold. However, the right hand side must be negative in situation 1 and 2 so the equation does not hold. It only holds when a = b = 0.
Because if only a = 0, then 0²+b² = -0*b -> b² = 0 -> b = 0
and if only b = 0, then a²+0² = -a*0 -> a² = 0 -> a = 0

Situation 3 and 4: if a and b have the opposite sign, the right hand side is positive:
3. a is negative, b is positive: (-a)²+b² = -(-a)b
-> a²+b² = ab
4. a is positive, b is negative: (a)²+(-b)² = -a(-b)
-> a²+b² = ab

If the absolute values of a and b are not equal, then a>b or b>a, therefore the absolute value of the right hand side must be lower than the left hand side because respectively a²>ab and b²>ab.
If the absolute values of a and b are equal, then a=b, and a²+b²=ab becomes 2a² = a², in which evidently 2a²>a² when a is not 0.

So, in all four situations we have shown that the equation only holds when a = b = 0."

The reply:
You're being a bit sloppy with notation in (3) and (4). If a and b have opposite signs then that doesn't mean that a² + b² = ab. I understand that on the right-hand side you're using a to mean -a (or b to mean -b), but that's rather poor practice.

Instead you should either use new variables (e.g. if a is negative, let r = -a so that r is positive, and then r² + b² = rb) or just use absolute values: if a and b have opposite signs then |a|² + |b|² = |a||b|.

Other than that, I don't see a problem with the idea of your proof.
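The absolute-value rewrite suggested in the reply makes the impossibility easy to check numerically: for positive ##x = \vert a \vert## and ##y = \vert b \vert##, ##x^2 + y^2 \geq 2xy > xy##, so ##\vert a \vert^2 + \vert b \vert^2 = \vert a \vert \vert b \vert## can never hold. A small added sketch:

```python
# For positive x, y: x^2 + y^2 > x*y (since x^2 + y^2 >= 2*x*y > x*y),
# so |a|^2 + |b|^2 = |a||b| is impossible when a and b are both non-zero.
import random

random.seed(1)
for _ in range(10_000):
    x = random.uniform(1e-6, 10.0)
    y = random.uniform(1e-6, 10.0)
    assert x * x + y * y > x * y
print("x^2 + y^2 > xy held for all positive samples")
```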
 
  • #38
zexxa said:
Well, late to the party, but I've been thinking about extending the domain of this question to include all numbers, including complex numbers. This is my proof for it and I'd like to know if it is valid.

Given ##a^2+ab+b^2=0## for all ##a,b \in \mathbb C##:
##a^2+b^2=-ab## ⇒ (realise that for this to be true, either both ##a,b \in \mathbb R## or both ##a,b \notin \mathbb R##)
$$ab \begin{cases} \leq 0 & \text{if } a,b \in \mathbb R \\ \geq 0 & \text{if } a,b \notin \mathbb R \end{cases}$$
##(a+b)^2-2ab=-ab##
##(a+b)^2=ab##
Suppose ##a,b≠0##
##⇒(a+b)^2/(ab)=1##
Since ##(a+b)^2## and ##ab## then have to be of the same sign, this contradicts the case statement above, and hence ##a## and ##b## must be ##0##.

The result is false for complex numbers. Let ##\omega \neq 1## be a (complex) cube root of unity; that is, ##\omega^3 = 1##. If ##b = \omega a## we have $$a^2 + a b + b^2 = a^2 ( 1 + \omega + \omega^2) = 0$$
for any ##a##, because ##1 + \omega + \omega^2 = 0##.

You can verify all this explicitly by taking
$$ \omega = -\frac{1}{2} + i \frac{\sqrt{3}}{2} .$$
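The counterexample is straightforward to verify numerically; a short added sketch (the particular non-zero ##a## below is an arbitrary choice):

```python
# With omega = -1/2 + i*sqrt(3)/2 (a primitive cube root of unity) and b = omega*a,
# a^2 + a*b + b^2 vanishes for any complex a, even though a and b are non-zero.
import math

omega = complex(-0.5, math.sqrt(3) / 2)
assert abs(omega**3 - 1) < 1e-12          # omega^3 = 1
assert abs(1 + omega + omega**2) < 1e-12  # 1 + omega + omega^2 = 0

a = 2.0 + 1.0j  # arbitrary non-zero complex number
b = omega * a
assert abs(a**2 + a * b + b**2) < 1e-9
print("a^2 + ab + b^2 = 0 with a, b != 0 over the complex numbers")
```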
 
  • #39
It's easy to simplify this to ##(a+b)^2 = ab## (add ##ab## to both sides of the original equation). The left side is a square, so ##ab \geq 0## and ##\vert a+b \vert = \sqrt{ab}##. Since ##ab \geq 0##, ##a## and ##b## have the same sign (or one is zero), and because replacing ##(a,b)## by ##(-a,-b)## leaves the equation unchanged, we may assume ##a, b \geq 0##; then ##a+b = \sqrt{ab}##. AM-GM for non-negative reals tells us that ##a+b \geq 2\sqrt{ab}##. Substitution gives ##\sqrt{ab} \geq 2\sqrt{ab}##, and the only way this can hold is ##\sqrt{ab} = 0##. Thus at least one of ##a## and ##b## is zero. Solving for the other gives ##a^2+a(0)+(0)^2=0##, so the other is zero as well.
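For completeness, AM-GM for non-negative reals, ##a + b \geq 2\sqrt{ab}##, can be spot-checked numerically; an added sketch with arbitrary sampling:

```python
# Spot-check AM-GM for non-negative reals: a + b >= 2*sqrt(a*b).
import math
import random

random.seed(2)
for _ in range(10_000):
    a = random.uniform(0.0, 10.0)
    b = random.uniform(0.0, 10.0)
    assert a + b >= 2 * math.sqrt(a * b) - 1e-12
print("a + b >= 2*sqrt(ab) held for all non-negative samples")
```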
 
  • #40
TheBlackAdder said:
Prove that if a²+ab+b² = 0, then a = 0 and b = 0.

a²+ab+b² = 0
a²+b² = -ab

Let's assume a and b are not 0. On the left hand side a² and b² are always positive, so their sum is also always positive. On the right hand side there are four possible situations:
  • 1. a and b are both positive (they have the same sign)
  • 2. a and b are both negative (they have the same sign)
  • 3. a is negative, b is positive (opposite signs)
  • 4. a is positive, b is negative (opposite signs)
Situation 1 and 2: if a and b have the same sign, the right hand side is negative:
1. a and b are positive: a²+b² = -ab
2. a and b are negative: (-a)²+(-b)² = -(-a)(-b)
-> a²+b² = -ab
Notwithstanding what someone on Reddit has to say, this is not a good proof. For one thing, it looks like you are concluding that ##a^2 + b^2 = -ab##. You should not conclude this -- you are assuming it is true. Since you are assuming that ##a^2 + ab + b^2 = 0##, it follows trivially that ##a^2 + b^2 = -ab##, regardless of whether a and b are positive, negative, of mixed signs, whatever. The two equations are equivalent. You don't need any machinery to get from one to the other, apart from adding equal quantities to both sides of either equation.

If a and b are both positive or both negative, then the equation ##a^2 + b^2 = -ab## can't possibly be true, and is therefore a contradiction. The left side is positive and the right side is negative. I mentioned this in my earlier post.

The worst part of your proof is that you seem to think putting a minus sign in front of a variable makes it negative. That is NOT true in general. For example, if x = -2, then -x is positive.
TheBlackAdder said:
Since the left hand side must be positive, the right hand side must be positive as well for the equation to hold. However, the right hand side must be negative in situation 1 and 2 so the equation does not hold. It only holds when a = b = 0.
Because if only a = 0, then 0²+b² = -0*b -> b² = 0 -> b = 0
and if only b = 0, then a²+0² = -a*0 -> a² = 0 -> a = 0
 
  • Likes: jim mcnamara, QuantumQuest and SammyS
