Prove that sqrt(2) is irrational using a specific technique


Homework Help Overview

The problem involves proving that √2 is irrational using a contradiction approach, specifically by assuming the existence of integers a and b (with b nonzero) such that (a/b)² = 2. The context is rooted in number theory, particularly in the exploration of rational and irrational numbers.

Discussion Character

  • Conceptual clarification, Assumption checking

Approaches and Questions Raised

  • Participants discuss how to establish that a and b can be assumed to be positive integers. Some explore the implications of the signs of a and b and the equivalence of cases where both are positive or negative.

Discussion Status

Participants are actively engaging with the problem, clarifying the reasoning behind the assumption of positivity for a and b. Guidance has been offered regarding the equivalence of cases and the concept of "without loss of generality" (wlog), which has helped some participants progress in their understanding.

Contextual Notes

There is an emphasis on the need to justify the assumption of positivity without proving it outright, as well as the exploration of different cases regarding the signs of a and b.

MissMoneypenny

Homework Statement



Prove that √2 is irrational as follows. Assume for a contradiction that there exist integers a, b with b nonzero such that (a/b)² = 2.

1. Show that we may assume a, b > 0.
2. Observe that if such an expression exists, then there must be one in which b is as small as possible.
3. Show that ((2b-a)/(a-b))² = 2.
4. Show that 2b-a > 0 and a-b > 0.
5. Show that a-b < b, a contradiction.

This problem is from Galois Theory, Third Edition by Ian Stewart.
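Not part of Stewart's problem, but the descent in steps 2 through 5 can be sketched with a short Python check. This is my own illustration: the function name `descend` and the use of continued-fraction convergents of √2 are assumptions, not from the text. Since no exact integer solution of (a/b)² = 2 exists, the sketch applies the map (a, b) → (2b-a, a-b) from step 3 to near-solutions, where the denominator strictly shrinks exactly as steps 4 and 5 predict.

```python
# Hypothetical illustration of the descent in steps 2-5.
# If (a/b)^2 = 2 exactly, the map (a, b) -> (2b - a, a - b) would produce a
# strictly smaller positive denominator, contradicting minimality of b.
# Applied to continued-fraction convergents of sqrt(2), it steps down the
# convergent list: 41/29 -> 17/12 -> 7/5 -> 3/2 -> 1/1.

def descend(a, b):
    """Apply the map from step 3: (a, b) -> (2b - a, a - b)."""
    return 2 * b - a, a - b

a, b = 41, 29  # convergent 41/29 of sqrt(2); note 41**2 - 2*29**2 = -1
while b > 1:
    assert 0 < a - b < b  # steps 4 and 5: the new denominator is smaller
    # the quantity a^2 - 2b^2 only flips sign under the map, so an exact
    # solution (where it equals 0) would stay an exact solution
    assert (2 * b - a) ** 2 - 2 * (a - b) ** 2 == -(a ** 2 - 2 * b ** 2)
    a, b = descend(a, b)

print(a, b)  # -> 1 1
```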

My question is how can we show that a, b > 0? Assuming that a and b are positive, I've completed steps 2 through 5. Now all that's left is to show that a and b are positive, which seems like it should be the simplest part. Nonetheless, I'm unclear how to do it.

Homework Equations



None

The Attempt at a Solution



Here's my line of thought so far. I'm not sure if it's correct.

First note that √2>1.
Now consider the possibilities for the signs of a and b. If exactly one of a or b is negative, then a/b < 0. But we have (a/b)² = 2, which would imply √2 = a/b < 0, which is false. If both a and b are positive or both are negative, then a/b > 0, so (a/b)² = 2 implies √2 = a/b > 0, which is true. It is therefore not valid to treat the case in which exactly one of a or b is negative, while it is valid to treat either the case a, b < 0 or the case a, b > 0, since the two are equivalent.

I'm very unsure of what I have written up there. Any help or guidance is much appreciated.

Thanks!
 
MissMoneypenny said:
Here's my line of thought so far. I'm not sure if it's correct.

First note that √2>1.
Now consider the possibilities for the signs of a and b. If exactly one of a or b is negative, then a/b < 0. But we have (a/b)² = 2, which would imply √2 = a/b < 0, which is false. If both a and b are positive or both are negative, then a/b > 0, so (a/b)² = 2 implies √2 = a/b > 0, which is true. It is therefore not valid to treat the case in which exactly one of a or b is negative, while it is valid to treat either the case a, b < 0 or the case a, b > 0, since the two are equivalent.

I'm very unsure of what I have written up there. Any help or guidance is much appreciated.

Thanks!
First, there is no way to prove that a and b are positive. The question is why can you assume a and b are positive. You've done most of the work. All you have to do is say what to do if a, b are both negative.
 
To amplify PeroK's response: you are not asked to show that a and b must be positive, only that they can be positive.
 
Thanks for the help PeroK and HallsofIvy. So to establish that a and b can be positive, can I simply comment that if (a/b)² = 2, then it is also true that (-a/b)² = 2 and that (-a/(-b))² = 2, so it doesn't matter whether we consider the case in which a and b are both positive, both negative, or in which one is positive and one is negative, since all three cases are equivalent?
 
MissMoneypenny said:
Thanks for the help PeroK and HallsofIvy. So to establish that a and b can be positive, can I simply comment that if (a/b)² = 2, then it is also true that (-a/b)² = 2 and that (-a/(-b))² = 2, so it doesn't matter whether we consider the case in which a and b are both positive, both negative, or in which one is positive and one is negative, since all three cases are equivalent?

That's nearly it, although I still think you may be missing the key point. If there were such integers a' and b', then there would be positive integers a and b with the same property.

If you show that no such positive integers can exist, then no such integers can exist.

In this case, you could take a = |a'| and b = |b'|.

But it's enough to say that if there are integers a, b with (a/b)² = 2, then there are positive integers a, b with this property. That's essentially the point.
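A minimal sketch of that point in Python (the helper name `to_positive` is my own, not from the thread): replacing a hypothetical solution (a', b') with (|a'|, |b'|) gives a pair of positive integers with the same squared ratio, which is exactly why the signs don't matter.

```python
# Sketch of "we may assume a, b > 0" (step 1): squaring discards signs,
# so any integer pair with (a/b)^2 = 2 could be replaced by (|a|, |b|).

def to_positive(a, b):
    """Replace a candidate pair (a, b) by (|a|, |b|)."""
    return abs(a), abs(b)

for a, b in [(-7, 5), (7, -5), (-7, -5), (7, 5)]:
    p, q = to_positive(a, b)
    assert p > 0 and q > 0
    # cross-multiplied form of (p/q)^2 == (a/b)^2, avoiding division
    assert p * p * b * b == a * a * q * q
```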
 
Ah, that makes sense! I think I've got it now and can safely work on my solution, but if I run into any more questions I'll make another post. Thanks again for your help!
 
I just realized that the term you're looking for is wlog, "without loss of generality". A very useful concept!
 
