I have no idea how to go about this, except that we are supposed to use proof by contradiction to show that the square root of 3 is an irrational number. Any help or tips are appreciated. Thanks.
A proof by contradiction works by first assuming that what you wish to show is false. Thus assume that the square root of 3 is rational. Then you can write: [tex]\sqrt{3} = \frac{p}{q}[/tex] where p and q are integers with no factors in common (and q non-zero). See if you can derive a contradiction from this (HINT: see if you can find a common factor, which would be a contradiction).
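Not a substitute for the proof, of course, but if it helps to see the arithmetic behind the hint, here is a quick Python sanity check: squaring [tex]\sqrt{3} = p/q[/tex] gives p² = 3q², and the common factor comes from the lemma that 3 | p² forces 3 | p. (The ranges below are arbitrary small bounds, just for illustration.)

```python
# If sqrt(3) = p/q then p^2 = 3*q^2. Check that no small p, q satisfy this:
assert all(p * p != 3 * q * q
           for p in range(1, 201) for q in range(1, 201))

# The lemma used to produce the common factor: if 3 divides p^2, then 3 divides p.
assert all(p % 3 == 0
           for p in range(1, 1000) if (p * p) % 3 == 0)

print("no small p/q squares to 3, and 3 | p^2 implies 3 | p on this range")
```

A finite check like this proves nothing by itself; the point of the contradiction argument is that it rules out *all* p and q at once.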
What is interesting is that the proof seems to originate with the Greeks, with their proof that the square root of 2 was irrational and, as they thought, "unutterable." But it seems that, historically, the proof was not immediately extended to other integers; it took some time to arrive at the irrationality of the square roots up to 17. I quote from http://www.cut-the-knot.org/proofs/sq_root.shtml, where Plato tells us:

Theaetetus: Theodorus was proving to us a certain thing about square roots, I mean the square roots of three square feet and five square feet, namely, that these roots are not commensurable in length with the foot-length, and he proceeded in this way, taking each case in turn up to the root of seventeen square feet; at this point for some reason he stopped. Now it occurred to us, since the number of square roots appeared to be unlimited, to try to gather them into one class, by which we could henceforth describe all the roots.

Socrates: And did you find such a class?

Theaetetus: I think we did.
Moderator's note: thread moved from "Number Theory". This sounds, looks, and smells like a homework problem. Please, no more help or hints until the OP responds and shows how far they have progressed towards a solution.
Hehe! I just wanted to open almost the same kind of topic. I don't really get proof by contradiction. What if I say: prove that the square root of 4 is an irrational number? If my assumption (that it is rational) leads to a contradiction, then it is not rational. But by using statements such as "they are relatively prime" it seems I could make every square number come out irrational. Or is it that, because they are perfect squares, we already know they are rational? If someone can show me, using proof by contradiction, that the square root of 4 is not irrational, I will make him/her a chocolate cake which I'll eat. <3, Icelove
Assume 4 is irrational. Then by definition it cannot be written as a fraction of integers. That is: [tex]\forall u,v\in \mathbb{Z},\ (u,v)=1,\ 4\neq\frac{u}{v}.[/tex] However, [tex]4=\frac{4}{1}[/tex], thus contradicting the assumption that 4 is irrational.
Now I got the aaaaaah! moment (not yawning, or pain, or other things). I love you. :) ps: Although you made a tiny mistake, since the _square root_ of 4 is the question; so I guess just evaluate, and 2 = 2/1.
2^2 = 4, so 2 is the square root of 4, and that is rational. However, I feel your main difficulty lies in understanding why the usual proof that sqrt(2) is irrational doesn't also show that sqrt(4) is irrational, so I'll show where the argument falls apart (you cannot prove sqrt(4) irrational, since it isn't). The usual proof for 2 writes sqrt(2) = p/q with p and q relatively prime, and manipulates it to get 2q^2 = p^2. The key thing to note here is that this gives 2|p^2, and since 2 is square-free and gcd(p,q) = 1, you get 2|p, from which 2|q follows, a contradiction. This kind of logic doesn't work with 4, because 4|p^2 does not imply 4|p. Given a larger non-square number such as [itex]12 = 2^2 \cdot 3[/itex], we still can't show that 12|p^2 implies 12|p, but the proof only needs one prime factor, so we instead use: 12|p^2 implies 3|p^2 implies 3|p implies 9|p^2 = 12q^2, so 3|4q^2 and hence 3|q, again contradicting gcd(p,q) = 1.
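To make the broken step concrete, here is a small Python check (ranges arbitrary, just for illustration): "4 | p^2 implies 4 | p" fails already at p = 2, whereas the square-free version "2 | p^2 implies 2 | p" survives every small case.

```python
# Counterexamples to "4 | p^2 implies 4 | p":
assert (2 * 2) % 4 == 0 and 2 % 4 != 0    # p = 2: 4 divides 4 but not 2
assert (6 * 6) % 4 == 0 and 6 % 4 != 0    # p = 6: 4 divides 36 but not 6

# For the square-free factor 2, the implication does hold on this range:
assert all(p % 2 == 0
           for p in range(1, 1000) if (p * p) % 2 == 0)

print("4 | p^2 does NOT imply 4 | p, but 2 | p^2 does imply 2 | p")
```

This is exactly why the sqrt(2)-style argument goes through for square-free factors but stalls for 4.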
Suppose [itex]\sqrt{3}[/itex] were rational. Then [itex]\sqrt{3}= a/b[/itex] for integers a and b, reduced to lowest terms (a and b have no common factors). From that, [itex]b\sqrt{3}= a[/itex] and so [itex]3b^2= a^2[/itex]. Now, a must be of the form "3n" (a multiple of 3), "3n+1" (one more than a multiple of 3), or "3n+2" (two more than a multiple of 3). If a = 3n+1, then [itex]a^2= (3n+1)^2= 9n^2+ 6n+ 1= 3(3n^2+ 2n)+ 1[/itex], NOT a multiple of 3. If a = 3n+2, then [itex]a^2= (3n+2)^2= 9n^2+ 12n+ 4= 3(3n^2+ 4n+ 1)+ 1[/itex], NOT a multiple of 3. That is, since [itex]a^2= 3b^2[/itex] is a multiple of 3, a itself must be a multiple of 3: a = 3n. Now we have [itex]a^2= (3n)^2= 9n^2= 3b^2[/itex], so [itex]b^2= 3n^2[/itex] is also a multiple of 3. By the same argument, applied to b rather than a, it follows that b is also a multiple of 3, contradicting the fact that a and b have no common factors.
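The case analysis above can be spot-checked numerically. A quick Python sketch (range arbitrary): a square is never 2 more than a multiple of 3, and a² is a multiple of 3 exactly when a is.

```python
for n in range(1000):
    assert ((3 * n + 1) ** 2) % 3 == 1   # a = 3n+1: a^2 = 3(3n^2+2n) + 1
    assert ((3 * n + 2) ** 2) % 3 == 1   # a = 3n+2: a^2 = 3(3n^2+4n+1) + 1
    assert ((3 * n) ** 2) % 3 == 0       # a = 3n:   a^2 = 3(3n^2)

# Equivalently: 3 | a^2 exactly when 3 | a.
assert all(((a * a) % 3 == 0) == (a % 3 == 0) for a in range(1000))

print("a^2 mod 3 is always 0 or 1, and is 0 exactly when 3 | a")
```

The algebraic expansions in the proof are what make this hold for *every* integer, not just the ones checked here.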