The methods of proving irrationality have always bothered me in my study of proofs. It seems that for each case a new method has to be invented out of the blue. I understand only the proof that ##\sqrt{k}## is irrational. But what happens if I want to prove it for ##\sqrt{2}+\sqrt{5}## or ##\sqrt{6}-\sqrt{5}##? Is it enough just to show that each term is irrational? Clearly it's not, considering ##\sqrt{2} - \sqrt{2}## is rational. Could you give me a hint on how to prove these? It just doesn't seem obvious to me, at least for now. Most of the textbooks I have read simply assume that once they have shown ##\sqrt{2}## is irrational, the same method can be applied to other forms of irrationals. Thank you.
That should be in any book on arithmetic. It is a combination of several facts, and some linear algebra is also helpful. Consider the minimal polynomial of the number. One way to think about it is to view the number as a (linear) operator on a ring extension of the integers. For example, for ##\sqrt{2}+\sqrt{5}## we have a basis ##\{1,\sqrt{2},\sqrt{5},\sqrt{10}\}## and the action

##(\sqrt{2}+\sqrt{5})(1)=\sqrt{2}+\sqrt{5}##
##(\sqrt{2}+\sqrt{5})(\sqrt{2})=2+\sqrt{10}##
##(\sqrt{2}+\sqrt{5})(\sqrt{5})=5+\sqrt{10}##
##(\sqrt{2}+\sqrt{5})(\sqrt{10})=5\sqrt{2}+2\sqrt{5}##

Your example ##\sqrt{2}-\sqrt{2}## fails since ##\sqrt{2}## and ##\sqrt{2}## are linearly dependent. So we have a monic characteristic polynomial; it is ##x^4-14x^2+9##, but that is not important. Now we know ##x## is an algebraic integer. We know, or can easily show, that ##x## is not a rational integer; thus ##x## is irrational.
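To make the operator picture concrete, here is a small sketch in plain Python (the matrix bookkeeping is my own, read off from the four actions above): it builds the multiplication-by-##(\sqrt{2}+\sqrt{5})## matrix on the basis ##\{1,\sqrt{2},\sqrt{5},\sqrt{10}\}## and checks that ##x^4-14x^2+9## annihilates it.

```python
# Multiplication-by-(sqrt2 + sqrt5) on the basis {1, sqrt2, sqrt5, sqrt10}:
# column j holds the coordinates of (sqrt2 + sqrt5) * basis[j],
# taken from the four actions written out above.
M = [
    [0, 2, 5, 0],   # coefficient of 1
    [1, 0, 0, 5],   # coefficient of sqrt2
    [1, 0, 0, 2],   # coefficient of sqrt5
    [0, 1, 1, 0],   # coefficient of sqrt10
]

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

M2 = matmul(M, M)
M4 = matmul(M2, M2)
I = [[int(i == j) for j in range(4)] for i in range(4)]

# Check that the characteristic polynomial x^4 - 14x^2 + 9 annihilates M,
# so sqrt2 + sqrt5 is a root of it (hence an algebraic integer).
residue = [[M4[i][j] - 14 * M2[i][j] + 9 * I[i][j] for j in range(4)]
           for i in range(4)]
print(all(v == 0 for row in residue for v in row))  # True
```

The eigenvalues of this matrix are exactly the four conjugates ##\pm\sqrt{2}\pm\sqrt{5}##, which is why its characteristic polynomial is the one quoted.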
There's no one trick that's gonna work all of the time. In the case of algebraic numbers, like the ones you have listed, you can find a polynomial for which that number is a root and then use the rational roots theorem to find out which rational number(s) it could be and check. For instance, ##\sqrt{2}+\sqrt{5}## is a root of ##x^4-14x^2+9##. The only potential rational roots of that polynomial are ##\pm1##, ##\pm3## and ##\pm9##. It's not incredibly hard to show that ##3<\sqrt{2}+\sqrt{5}<6##, and so it can't be one of the possible rational roots. Thus it's irrational. Of course finding an appropriate polynomial and figuring out how to check whether your number is one of the possible roots is easier said than done in most cases.
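The rational-roots check described above can be run mechanically; here is a short sketch (the bound ##3<\sqrt{2}+\sqrt{5}<5## used numerically below is a sanity check, not the proof):

```python
from math import sqrt

# Candidate rational roots of x^4 - 14x^2 + 9: the polynomial is monic,
# so candidates are the integer divisors of the constant term 9.
def p(x):
    return x**4 - 14 * x**2 + 9

candidates = [d * s for d in (1, 3, 9) for s in (1, -1)]
print([c for c in candidates if p(c) == 0])  # [] -- no rational roots at all

# sqrt(2) + sqrt(5) lies strictly between 3 and 5, so it is none of the
# candidates; being a root of p, it must therefore be irrational.
x = sqrt(2) + sqrt(5)
print(3 < x < 5, abs(p(x)) < 1e-9)  # True True
```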
OK, I understand that you can use the rational root theorem to show that ##x## is not rational: it satisfies a polynomial, yet it is none of the candidate roots built from the divisors of the constant coefficient. Is this last fact enough, or do we also have to show ##3<\sqrt{2}+\sqrt{5}<6##? In either case, how do you establish that inequality?
Well, if you want to use my strategy and you're trying to show that your number is not rational, then you need to find some way to show that it's not one of the possible rational roots (given, again, by the rational roots theorem). In this particular case, the easiest way to demonstrate that is, in my opinion, to use the obvious (?) facts that ##1<\sqrt{2}<2## and ##2<\sqrt{5}<3##. So the inequality I should have used, had I been thinking a bit more clearly, is ##3<\sqrt{2}+\sqrt{5}<5##, but the one I did use is still true and still (at this point, I hope) obvious.
I can see that it's true and I have no objection to it, but if possible I'd like to see the fact you used to get ##1<\sqrt{2}<2##. Is it because if ##0 < a < b## then ##a^2 < ab < b^2##?
Here's a neat way to prove that ##x = \sqrt{a} + \sqrt{b}## is irrational for integers ##a## and ##b## with ##a \neq b##, where at least one of the two square roots is irrational (say ##\sqrt{a}##, without loss of generality). Let ##y = \sqrt{a} - \sqrt{b}## and consider ##xy = a - b##. Clearly ##xy## is a non-zero integer. That means either both ##x## and ##y## are rational (case 1) or both are irrational (case 2). (To see why, suppose ##x## is irrational, ##y## is rational, and ##xy = n##, an integer. Then ##y = p/q## with ##p \neq 0## since ##xy \neq 0##, so ##x = nq/p##, which is clearly rational, a contradiction.) Now consider ##x+y = 2\sqrt{a}##. This is clearly irrational, so case 1 is ruled out, and therefore both ##x## and ##y## are irrational. (QED) I know this doesn't exactly answer your question (you were probably looking for a more general method covering a wider class of problems), but I hope this has given you some insight.
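A quick numerical sanity check of the two identities the argument rests on, for the thread's running example ##a = 2##, ##b = 5## (this checks the algebra, not the irrationality itself):

```python
from math import sqrt

# x = sqrt(a) + sqrt(b), y = sqrt(a) - sqrt(b), as in the argument above.
a, b = 2, 5
x = sqrt(a) + sqrt(b)
y = sqrt(a) - sqrt(b)

print(abs(x * y - (a - b)) < 1e-9)        # True: xy = a - b = -3, a non-zero integer
print(abs((x + y) - 2 * sqrt(a)) < 1e-9)  # True: x + y = 2*sqrt(2), irrational
```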
Does the following work? Suppose ##\sqrt{2} + \sqrt{5} = p/q## for some integers ##p## and ##q##. Then ##2 + 2\sqrt{10} + 5 = p^2/q^2##. It follows that ##2\sqrt{10} = p^2/q^2 - 7 = r/s## for some integers ##r## and ##s##. Thus ##\sqrt{10} = r/2s = m/n## for some integers ##m## and ##n##. But it is easy to show that ##\sqrt{10}## is irrational. Our assumption therefore leads to a contradiction, so we are forced to accept that ##\sqrt{2} + \sqrt{5}## is irrational.
This is a nice idea: you're using ##xy## and ##x+y## to show that both are irrational, and hence that the original construction is irrational. That proof, and Vashtek's, inspire me to do something like this:

##\sqrt{2}+\sqrt{5}=\frac{a}{b}##
##7+2\sqrt{10}=\frac{a^2}{b^2}##

Now we know ##7## is rational, and if we already know ##\sqrt{10}## (and hence ##2\sqrt{10}##) is irrational, then by the additive closure of ##\mathbb{Q}##, ##\frac{a^2}{b^2} \not\in \mathbb{Q}##. How about that? Does that imply ##\frac{a}{b} \not\in \mathbb{Q}## as well? I suppose that is so if we also consider the multiplicative closure.
I suppose so. Proof: Suppose ##p^2## is irrational, and suppose ##p## is rational. Then ##p = a/b## for some integers ##a## and ##b##, so ##p^2 = a^2/b^2##, which is clearly rational. But that contradicts our initial assumption that ##p^2## is irrational. Thus we conclude ##p## is irrational.
Yes, that would work. But my method is more general. For example, it allows the proof of the irrationality of ##\sqrt{18} + \sqrt{2}## at once. However, with your method, you end up with ##\frac{a^2}{b^2} = 32##. You can't immediately conclude that ##\sqrt{32}## is irrational, unless you've proven it (although I suppose expressing it as ##4\sqrt{2}## affords a quick proof based on your previously having established the irrationality of ##\sqrt{2}##). Of course, you could prove a far more general result - that the square root of a natural number is always irrational except when the number is a perfect square.
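That last general fact is easy to exercise mechanically. A small sketch (the helper name `sqrt_is_rational` is mine; `math.isqrt` computes the exact integer square root, so the perfect-square test is exact, unlike floating-point `sqrt`):

```python
from math import isqrt

# For a natural number n, sqrt(n) is rational iff n is a perfect square.
def sqrt_is_rational(n: int) -> bool:
    r = isqrt(n)          # exact floor of the square root
    return r * r == n

print([n for n in range(1, 37) if sqrt_is_rational(n)])  # [1, 4, 9, 16, 25, 36]
print(sqrt_is_rational(32), sqrt_is_rational(10))        # False False
```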