## Prove that between any two rational numbers there is at least one irrational number.

and, consequently, infinitely many.

I am new to proofs so could you please check if this proof is correct?

Let x be an irrational number in the interval In = [an, bn], where an and bn are both rational numbers, in the form p/q.

Let z be the distance between x and an, so:

x - an = x - p1/q1
= x + (-p1/q1)

But an irrational number (in this case x) plus a rational number is also an irrational number. Therefore the distance from an to x, which is z, is irrational.

The distance from x to the origin is an + z. But again, an irrational number plus a rational number is also irrational. Therefore, there is always at least one irrational number between any two rational numbers. Moreover, the same proof can be applied to an infinite number of subintervals within In, so there are infinitely many irrational numbers as well.

Recognitions: Homework Help, Science Advisor

The first thing you did was to assume the result that you wanted to show. If you want to show that there is an irrational in an interval [a,b], then you can't assume there's one in there. There's no need to use the subscript a_n (or b_n).

You can prove this result in many ways. An elegant way that I just thought of (it's not new, just not the one I first thought of) is to show that one can reduce the question to the case of showing there is an irrational in the interval [0,1].
Well, in ~assuming~ the first step you seem to have a decent conclusion for the second. If we assume q, p does follow, but as MG points out you first assumed q to prove p, and then used p to justify q (with q as "there exists at least one irrational number between a and b", and p as "there therefore exist infinitely many").


 Quote by matt grime The first thing you did was to assume the result that you wanted to show. If you want to show that there is an irrational in an interval [a,b], then you can't assume there's one in there. There's no need to use the subscript a_n (or b_n). You can prove this result in many ways. An elegant way that I just thought of (it's not new, just not the one I first thought of) is to show that one can reduce the question to the case of showing there is an irrational in the interval [0,1].

I was afraid that was the mistake I made. Thanks for the suggestion.

Okay, here is another try at me trying to prove that between any interval [a,b], there is at least one irrational number:

Let there be an interval I = [0,1]. An irrational number in I is 1/root2.

1) For any other interval that is not [0,1] but encompasses interval I, 1/root2 is also an irrational number in that interval.

2) For any other interval [a,b] where 0 < a < 1 and 0 < b < 1 and a < b (basically a subinterval of interval I), 1/root2 must also be an irrational number in that interval because there are an infinite amount of subintervals within that interval.

3) For any other interval which is not [0,1], does not encompass [0,1], and is not a subinterval of [0,1] (the interval is "shifted" over), then let that interval be I_2 = [a,b], where a and b are both rational numbers in the form p/q. The distance between the left end point of I and I_2 is: a - 0 = a = p/q. Therefore, an irrational number in I_2 is 1/root2 + p/q.

Therefore, within any interval [a,b], there is at least one irrational number. And, also, since there are an infinite amount of subintervals within any interval, with each subinterval having at least one irrational number, there are an infinite amount of irrational numbers in any interval.

 Quote by JG89 2) For any other interval [a,b] where 0 < a < 1 and 0 < b < 1 and a < b (basically a subinterval of interval I), 1/root2 must also be an irrational number in that interval because there are an infinite amount of subintervals within that interval.
1/sqrt(2) is about 0.7, right? So how does that lie in, say, the subinterval [0, 1/2]?

 Quote by matt grime 1/sqrt(2) is about 0.7, right? So how does that lie in, say, the subinterval [0, 1/2]?

I should say that for any interval [0,b], where b < 1, let there be a rational number p/q = |b - 1|.

An irrational number in the interval [0,b] is then 1/root2 - p/q.

The same thing can be applied to an interval where a is not 0 but b = 1.

Hope you didn't see my solution I posted prematurely here.

Why are you letting p/q = |b-1|? There's no need to write fractions, and by assumption 1-b is positive. Notice that if b=0.1, then 1-b=0.9, and 1/sqrt(2) - 0.9 is not in the interval [0,0.1] as you claim: it isn't even positive.
The traditional way is to use the Archimedean property. Say a is the first rational and b is the second, with b > a. Then there is an n in N such that 1/n < b - a. Also, you can find an irrational number x < a.

Then, letting S = {x + k/n | k is an integer}, there must be an element y of S with b < y and y - b < 1/n (take the smallest element of S exceeding b). Therefore y - 1/n < b. And since b < y and 1/n < b - a, we have b - (b - a) < y - 1/n, which means a < y - 1/n. Therefore a < y - 1/n < b. Since y - 1/n is irrational, the proof is complete.
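The Archimedean construction above can be sketched numerically. This is purely an illustration (floats are themselves rationals, so nothing here literally produces an irrational; the function name is my own):

```python
import math

def irrational_between(a, b):
    """Sketch of the Archimedean construction for rationals a < b."""
    assert a < b
    n = math.ceil(1 / (b - a)) + 1   # Archimedean property: 1/n < b - a
    x = a - math.sqrt(2)             # an irrational number below a
    # smallest k with x + k/n > b, i.e. the first element of S past b
    k = math.floor((b - x) * n) + 1
    y = x + k / n                    # b < y and y - b < 1/n
    return y - 1 / n                 # lands strictly inside (a, b)
```

Running it on a couple of intervals, e.g. `irrational_between(0, 0.5)` or `irrational_between(1, 2)`, confirms the returned value sits strictly between the endpoints.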
What a mess! Thanks Matt and Werg for spending the time and helping me with this.

This is something else I came up with, hopefully it works: root2 is an irrational number. There is always some constant k, which is rational, which you could add to root2, resulting in a number c such that a < c < b for some other interval, and c is irrational (since a rational + an irrational is an irrational).
Ah, silly of me, the method I explained is too strong: it's used to prove that between any two real numbers there is an irrational number.

 Quote by matt grime .... to show that there is an irrational in an interval [a,b] ... You can prove this result in many ways. An elegant way that I just thought of (it's not new, just not the one I first thought of) is to show that one can reduce the question to the case of showing there is an irrational in the interval [0,1].
 Quote by JG89 What a mess! Thanks Matt and Werg for spending the time and helping me with this. This is something else I came up with, hopefully it works: root2 is an irrational number. There is always some constant k, which is rational, to which you could add to root2, resulting with a number, c, such that a < c < b for some other interval and c is irrational (since a rational + an irrational is an irrational).
JG89,

You're getting closer, JG!

However, your last argument would normally be considered more like "the idea of a proof" rather than a proof.

Here are two possible next steps.

1) Take your argument above and flesh it out. For example, it would be fair to assume that it is "well known" that for all epsilon greater than zero, there exists a rational number f such that |x-f| < epsilon. This fact could be combined with your idea to make a real proof.

2) Try to find the "elegant" argument that Matt Grime suggested. Here's a hint. Write down the equation of a linear function f:R->R such that f(0)=a and f(1)=b.

DJ
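The "well known" fact in option 1 above (every real x has a rational within any epsilon) can itself be made concrete. A minimal sketch, with a helper name of my own choosing, using the same Archimedean idea of picking a large enough denominator:

```python
import math

def rational_near(x, eps):
    """Return (p, q) with |x - p/q| < eps.

    Pick q with 1/q < eps, then p = floor(x*q), so that
    p/q <= x < (p+1)/q and hence |x - p/q| < 1/q < eps.
    """
    q = math.ceil(1 / eps) + 1
    p = math.floor(x * q)
    return p, q
```

For example, `rational_near(math.sqrt(2), 1e-3)` returns a fraction within 0.001 of root2.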

I thought about it a bit. Here is what I have now:

Consider the interval I = [a,b], where both a and b are rational numbers. For all epsilon > 0, |x - f| < epsilon. Using x = root2 as an irrational number, |root2 - f| < epsilon. However, a < epsilon < b. So:

|root2 - f| < b
|-f| < b - root2
f > root2 - b

Considering the other end point, a:

|root2 - f| > a
|-f| > a - root2
f < root2 - a

Therefore, for any rational value f, where (root2 - b) < f < (root2 - a), (root2 - f) will yield an irrational number in interval [a,b].
 Deacon, I have to think about this more for a while. But, could you please explain what was wrong with the proof that I just posted?

 Quote by JG89 I thought about it a bit. Here is what I have now: Consider the interval I = [a,b], where both a and b are rational numbers. For all epsilon > 0, |x-f| < epsilon. Using x = root2 as an irrational number, |root2 - f| < epsilon. However,
What you wrote does not make any sense.

Giving you a "kind interpretation," I'm guessing that what you meant was:

"Consider the interval I = [a,b], where both a and b are rational numbers. For all epsilon > 0, there exists a rational number "f" such that |x-f| < epsilon.

"Let x = root2. Let epsilon be a positive number between a and b. In other words, let epsilon be a number greater than zero such that a < epsilon < b."

Now what's wrong with that?

Well first, a and b might be less than zero, so, there might not be any positive numbers between them.

But, that doesn't really get to the heart of the matter.

Epsilon is supposed to be a "small" positive number because it is supposed to force "f" to be close to "root2".

"a" and "b" might be very large numbers, so your choice might yield a very large value for epsilon.

Compare this with the choice that I suggest for epsilon. Sure, (b-a)/4 might be large, but I could have chosen epsilon = (b-a)/x for any x>4 and the proof I suggest will still work. I could have chosen epsilon to be very small no matter what the value of a and b are. Your choice does not have this property.

I didn't work through the rest of your argument. Better that you should come up with a correct proof and then try to find the mistake in your old proof.

It's not so easy always to pinpoint another's mistake. And your doing it yourself would be a lot more profitable for you than me doing it for you.

One more hint to help you find a correct argument. Go back and look at what I called your "outline" for a proof in the spirit of what I am calling "Option 1." Your outline was correct.

DJ

 $$a+\frac{\pi}{4}\left(b-a\right)$$
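This closing formula is the "elegant" reduction made explicit: the linear map f(t) = a + t(b - a) sends [0,1] onto [a,b], and it sends the irrational pi/4 (which lies in (0,1)) to a number strictly between a and b that is irrational whenever a and b are rational. A quick numerical spot-check (illustrative only, since floats are rational approximations):

```python
import math

def between(a, b):
    # f(t) = a + t*(b - a) maps (0, 1) onto (a, b); pi/4 is an
    # irrational in (0, 1), so its image is irrational for rational a, b
    return a + (math.pi / 4) * (b - a)

c = between(1/3, 1/2)   # strictly between 1/3 and 1/2
```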