
Discrete Math irrational and rational numbers proof

  1. Sep 18, 2011 #1
    1. The problem statement, all variables and given/known data

    Prove by contradiction. Your proof should be based only on properties of the integers, simple algebra, and the definition of rational and irrational.

    If a and b are rational numbers, b does not equal 0, and r is an irrational number, then a+br is irrational.

    2. Relevant equations

A rational number is one that can be written as a ratio of two integers with a nonzero denominator.

    3. The attempt at a solution

I wrote a proof but am not sure it is correct. If it isn't, please tell me what I did wrong and show me how to do it right. My teacher indicated that we need to make use of the fact that b does not equal 0 (in a+br). Did I do that sufficiently?

Proof: Suppose not. That is, suppose there exist rational numbers a and b, with b not equal to zero, and an irrational number r, such that a+br is rational [We must deduce a contradiction].
    By definition of rational, a = c/d, b= e/f , a+br = g/h for some integers c,d,e,f,g,and h with h,f,d, and b not equal to 0.
    By substitution, a+b(r) = c/d +(r)( e/f) = g/h.
    Solving for r gives: r = (fgd-chf) / (ehd)
    Now fgd and chf are integers (being products of integers) and ehd does not equal 0 (by zero product property). Thus by definition of rational, r is rational which contradicts the supposition that r is irrational [ Hence the supposition is false and the statement is true].
  3. Sep 18, 2011 #2


    Homework Helper

    Looks pretty solid. A few minor things I found:

    1) When you define b, you also want to state that e is not equal to 0; otherwise the later division by e isn't justified.

    2) You probably want to list your steps for solving for r, so your teacher doesn't have to do the extra work of checking whether it's right.
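
    For reference, the solving-for-r steps, using the letters from post #1, would look something like this (just clearing fractions at each step):

    ```latex
    \frac{c}{d} + r\,\frac{e}{f} = \frac{g}{h}
    \quad\Longrightarrow\quad
    r\,\frac{e}{f} = \frac{g}{h} - \frac{c}{d} = \frac{gd - ch}{hd}
    \quad\Longrightarrow\quad
    r = \frac{f\,(gd - ch)}{ehd} = \frac{fgd - chf}{ehd}.
    ```

    The final step divides by e/f, which is exactly where e not equal to 0 gets used.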
  4. Sep 18, 2011 #3
    Thanks for the help! I'll correct those things ASAP before handing it in...
  5. Sep 18, 2011 #4

    Ray Vickson

    Science Advisor
    Homework Helper

    Since you have already answered the question, I don't mind pointing out some shortcuts. Let c = a + br, and assume a, b and c are rational, and that b =/= 0. Then we can solve for r: r = (c-a)/b (legal because b is not zero). The numerator c-a is rational and the denominator b is rational, so their ratio is rational. This contradicts the assumption that r is irrational. Thus, c must also be irrational.
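
    To spell out why (c-a)/b is a ratio of integers, one could write a = p/q, b = s/t, c = u/v for integers p, q, s, t, u, v with q, t, v nonzero, and s nonzero since b is nonzero (these letters are mine, not from the posts above):

    ```latex
    r = \frac{c-a}{b}
      = \frac{\dfrac{u}{v}-\dfrac{p}{q}}{\dfrac{s}{t}}
      = \frac{uq - pv}{vq}\cdot\frac{t}{s}
      = \frac{t\,(uq - pv)}{svq},
    ```

    a ratio of integers whose denominator svq is nonzero, so r would be rational, giving the contradiction.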
