Why Does 2 Seem to Equal 3 in This Mathematical Scenario?

  • Thread starter: PrincePhoenix
AI Thread Summary
The discussion centers around a flawed mathematical proof that claims 2 equals 3. The error arises from incorrectly manipulating the equation and assuming the equality of 2 and 3 without justification. Participants highlight that squaring both sides can introduce extraneous roots and that one cannot assume what is to be proven. The main fallacy is the initial assumption that 2 equals 3, which invalidates the proof. The conversation concludes with an acknowledgment that the original poster was seeking clarification rather than attempting to prove a point.
PrincePhoenix
Why does 2 seem to equal 3 here?

Suppose 2=3

Then subtract 5/2 from both sides.
2 - 5/2 = 3 - 5/2
(4 - 5)/2 = (6 - 5)/2
-1/2 = 1/2

Square both sides.
(-1/2)^2 = (1/2)^2
1/4 = 1/4

What's wrong? Why does this seem to prove that 2 = 3?
 
Last edited:
You didn't prove anything. Note that when you square both sides you effectively multiply the sides by different numbers.
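Concretely (a minimal sketch using the numbers from the opening post): squaring the line -1/2 = 1/2 amounts to multiplying the left side by -1/2 and the right side by +1/2, two different numbers:

\left(-\frac{1}{2}\right)\cdot\left(-\frac{1}{2}\right) = \frac{1}{4} = \left(\frac{1}{2}\right)\cdot\left(\frac{1}{2}\right)

Both products happen to equal 1/4, but because the two sides were multiplied by different factors, that agreement says nothing about the line above it.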
 
Maybe it is because I am not fully awake yet, but I do not follow your arithmetic.

2 - 5/4 = 3/4 not -1/2
3 - 5/4 = 7/4 not 1/2
 
Doc Al said:
You didn't prove anything. Note that when you square both sides you effectively multiply the sides by different numbers.
So you mean that squaring was the wrong step? I thought I did the same thing to both sides of the equation. Which step is incorrect?
 
pbandjay said:
Maybe it is because I am not fully awake yet, but I do not follow your arithmetic.

2 - 5/4 = 3/4 not -1/2
3 - 5/4 = 7/4 not 1/2
I corrected the mistake. Check again.
 
When you square an equation, you run the risk of creating extraneous roots. For example, (-2)^2 = 2^2 = 4, but this does not prove that -2 = 2. Another mistake you made is assuming what you're trying to prove. To have a correct proof you would need to begin with the equality 1/4 = 1/4 and proceed to show that 2 = 3 (which you can't do).

Edit: Just so it's clear, the biggest fallacy in your "proof" was the assumption that 2 = 3. You can't assume what you're trying to prove.
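Put another way (a sketch of the same point in symbols): squaring preserves equality in one direction only,

x = y \implies x^2 = y^2, \qquad \text{but} \qquad x^2 = y^2 \implies |x| = |y|, \text{ i.e. } x = \pm y.

Applied to the opening post, (-1/2)^2 = (1/2)^2 only tells you that |-1/2| = |1/2|, which is true and harmless; it does not let you travel back up to -1/2 = 1/2.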
 
To be absolutely clear, here's why you can't assume what you're proving...

Let a, b \in \mathbb{R} and suppose that a = b. Then clearly 0*a = 0*b = 0. By your same logic, this would show that any two real numbers a and b are equal to each other. A correct proof would need to start from the equality 0 = 0 and work from there (which is impossible).
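In symbols (a minimal sketch of the logical structure): the "proof" has the form

P \implies T, \quad \text{with } T \text{ true},

which establishes nothing about P, because a false statement can imply a true one. Here P is "2 = 3" and T is "1/4 = 1/4". A valid argument has to run the other direction: start from statements already known to be true and derive the claim.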
 
Thanks for the answers. I didn't want to PROVE anything here. This was just shown to me by a friend and I just wanted to know what was wrong.
 