
Is there a proper universal proof for proving irrationality?

  1. Jun 10, 2007 #1
    Okay, I really have a problem with the proof of the irrationality of the square root of 2. Is there a proper universal proof for proving irrationality? I mean, my teacher used the same logic for proving the irrationality of the root of 3! As far as I can see, I could use it to show that the square root of 4 or 9 is also irrational!

    Note: I searched the forum yet found nothing...
  3. Jun 10, 2007 #2

    matt grime

    Science Advisor
    Homework Helper

    Why can you show that? I presume you mean 'reductio ad absurdum'. I'll bet if you try it for 4 or 9 you'll reach a sticky issue. So try it to see where the proof goes wrong. The proof is a good proof, and uses some interesting facts about primes (p is a prime if and only if p|ab implies p|a or p|b. The symbol | means divides without remainder).
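    The prime property matt grime cites can be checked empirically. The following is an illustrative sketch of mine (not part of the original proof): it tests, by brute force over a small range, whether p | a*b implies p | a or p | b, and shows the property holds for the prime 3 but fails for the composite 4.

    ```python
    def has_euclid_property(p, limit=50):
        """True if, for all a, b below limit, p | a*b implies p | a or p | b."""
        return all(
            a % p == 0 or b % p == 0
            for a in range(1, limit)
            for b in range(1, limit)
            if (a * b) % p == 0
        )

    print(has_euclid_property(3))  # True: 3 is prime
    print(has_euclid_property(4))  # False: 4 | 2*2 but 4 divides neither factor
    ```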
  4. Jun 10, 2007 #3
    Alright, I can't find a proof for the root of 3 anywhere on the internet. So can you show me your proof of the irrationality of 3^(1/2)?
  5. Jun 10, 2007 #4


    Science Advisor
    Homework Helper
    Gold Member
    Dearly Missed

    Note that "being irrational" is the denial of the property "rational".
    Therefore, unless you can find some positive, constructive characterization of numbers equivalent to the negative definition as irrationals, just about the "only" way to prove that a number is irrational is to deny it the property of being rational.
    But that is frightfully close to saying that "reductio ad absurdum" is THE way to prove the irrationality of a number.
  6. Jun 10, 2007 #5
    This link is the only one I can find so far...
    I don't see why I can't put any other number in the place of 3... Say I put 4; would the proof not then say that the root of 4 is irrational?
  7. Jun 10, 2007 #6
    What happens when you replace 3 with 4? Can you think of a number p, such that p^2 is divisible by 4, but p is not?
  8. Jun 10, 2007 #7
    I am sorry but I do not entirely follow you. Also I don't entirely follow why that is significant to this proof - maybe this is where I have gone wrong?
    Last edited by a moderator: Jun 10, 2007
  9. Jun 10, 2007 #8
    This would have to be coupled with a mention of the intermediate value theorem though.
  10. Jun 10, 2007 #9
    I still don't see how Moo of Doom has confirmed the validity of the proof for 3^(1/2)
  11. Jun 10, 2007 #10
  12. Jun 10, 2007 #11
    Well, what about 2? 2^2 = 4 is divisible by 4, but 2 is not divisible by 4. Thus, the proof fails.

    The proof boils down to this:

    Suppose the square root of 3 were rational. Then it could be written as a/b for some integers a and b. These integers might have some factor in common, so we simply divide both by their greatest common divisor to get the fraction in reduced form.

    But by applying a bit of algebra and some number theory, we see that under our assumption, both the numerator and the denominator would have to be divisible by 3, even though the fraction was assumed to be in reduced form. This is a contradiction, so our assumption must have been wrong: the square root of 3 is not rational.

    You cannot apply this argument to the square root of 4, because you simply can't use the same trick to derive that both the numerator and the denominator have a common divisor greater than 1.

    The trick?

    If a^2 is divisible by 3, then since 3 is prime, a must be divisible by 3 as well. A factor of 3 can't pop out of nowhere. If you multiply two integers and get a multiple of 3, one of the integers must be a multiple of 3.

    But just because a^2 is divisible by 4, we can't deduce that a is divisible by 4. It might not be. We only need a to be divisible by 2, since then a = 2k for some integer k, and thus a^2 = 2k*2k = 4k^2 is divisible by 4 (even though a isn't necessarily!).
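    The asymmetry Moo of Doom describes can be illustrated by a brute-force search (a sketch of mine, not from the thread): "n | a^2 implies n | a" holds for n = 3 but has counterexamples for n = 4.

    ```python
    def counterexamples(n, limit=100):
        """Integers a below limit with n | a^2 but not n | a."""
        return [a for a in range(1, limit) if a * a % n == 0 and a % n != 0]

    print(counterexamples(3))  # []: no counterexample, the trick works for 3
    print(counterexamples(4))  # [2, 6, 10, ...]: 4 | 2^2 yet 4 does not divide 2
    ```

    The empty list for 3 is exactly the fact used in the proof; the nonempty list for 4 is where the analogous "proof" for sqrt(4) breaks down.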
  13. Jun 10, 2007 #12
    Well, this is where I come into play... I say that this is because WE KNOW the exact value: 2*2 is 4. This proof does not cover the fact that there is no possible recurring pattern in 2^(1/2). There might well be such a number. However, there is not. THIS PROOF (as I see it) does not cover this...
  14. Jun 10, 2007 #13
  15. Jun 10, 2007 #14

    matt grime

    Science Advisor
    Homework Helper

    What? I think you ought to take some time out to think about the proof a little more closely. Where does anyone invoke the fact that 2*2=4? Recurring patterns? That is neither here nor there.

    If there is a rational number whose square is 2, we can deduce a contradiction, therefore there is no such number. The deduction fails for 4, as we have explained above.

    As it happens there is a constructive proof:

    suppose that r is some integer and r=a^2/b^2, or r*b^2=a^2. By uniqueness of prime decomposition, the RHS has even powers of all primes, and so must the LHS, thus r must be a perfect square.

    Shall we rehash the reductio ad absurdum proof? Then you can point out the precise step that confuses you.

    1. If 2=a^2/b^2 with (a,b)=1,

    2. then 2b^2=a^2,

    3. 2 divides the LHS,

    4. 2 divides a^2

    5. since 2 is prime, 2 must divide a.

    6. write a=2c

    7. 2b^2=4c^2 so b^2=2c^2

    8. 2 divides the RHS, so 2 divides b, contradicting our assumption that (a,b)=1

    STEP 5 fails if we were to use 4 instead of 2. We'd just have 4b^2=a^2, 2 still divides a, so a=2c, thus 4b^2=4c^2 or b=c (assuming both positive), thus 4=4, and there's no contradiction.
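    The eight steps above can be mirrored by an exhaustive search (my own sketch, not matt grime's): no coprime pair (a, b) satisfies a^2 = 2*b^2, which is what the contradiction guarantees, whereas a^2 = 4*b^2 does have a coprime solution, namely (2, 1), so no contradiction arises for 4.

    ```python
    from math import gcd

    def coprime_solutions(n, limit=200):
        """Coprime pairs (a, b) with a^2 = n * b^2, searched up to limit."""
        return [(a, b) for b in range(1, limit) for a in range(1, limit)
                if gcd(a, b) == 1 and a * a == n * b * b]

    print(coprime_solutions(2))  # []: sqrt(2) has no representation a/b
    print(coprime_solutions(4))  # [(2, 1)]: sqrt(4) = 2/1, no contradiction
    ```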
  16. Jun 10, 2007 #15
    THANK YOU. All sources I have seen omit step 5 in their reasoning. That was all you needed to show, saving me from all these nonsensical posts...
  17. Jun 10, 2007 #16

    matt grime

    Science Advisor
    Homework Helper

    The reasoning of step 5 was mentioned in posts 2,6, and 11.
  18. Jun 10, 2007 #17
    Yes, thanks for pointing that out
  19. Jun 10, 2007 #18

    Gib Z

    Homework Helper

    Writing all positive integers as a product of their prime factors, we can see that this proof can be generalized: the square root of any positive integer that is not a perfect square is irrational. In fact, I think one could similarly extend this to the nth roots of numbers... very fascinating, in my opinion.
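    Gib Z's generalization can be phrased as a small decision procedure (a sketch of mine; the function name and the float-plus-neighbour-check approach are my own choices, adequate for small inputs): the kth root of a positive integer n is rational exactly when n is a perfect kth power.

    ```python
    def nth_root_is_rational(n, k):
        """True iff the positive integer n is a perfect k-th power."""
        r = round(n ** (1.0 / k))
        # float rounding can be off by one, so check neighbours exactly
        return any((r + d) ** k == n for d in (-1, 0, 1) if r + d >= 0)

    print(nth_root_is_rational(2, 2))   # False: sqrt(2) is irrational
    print(nth_root_is_rational(9, 2))   # True:  sqrt(9) = 3
    print(nth_root_is_rational(27, 3))  # True:  27^(1/3) = 3
    print(nth_root_is_rational(12, 2))  # False: sqrt(12) is irrational
    ```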
  20. Jun 11, 2007 #19


    Science Advisor
    Homework Helper

    Suppose you know that every rational number has a unique "lowest form".

    If sqrt(n) = p/q is in lowest form, then so is n/1 = p^2/q^2. But then q^2 = 1, and p^2 = n, so n is a perfect square, unlike 2 or 3 or 12 or.......
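    The key fact in mathwonk's argument is that if p/q is in lowest form, so is p^2/q^2. Python's Fraction type keeps fractions in lowest form automatically, so this can be observed directly (a small sketch of mine):

    ```python
    from fractions import Fraction

    r = Fraction(5, 3)                 # lowest form: gcd(5, 3) = 1
    s = r * r                          # 25/9
    print(s.numerator, s.denominator)  # 25 9: still in lowest form
    # So if sqrt(n) = p/q is in lowest form, n/1 = p^2/q^2 forces q^2 = 1.
    ```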