Calculus theory proof: Suppose a is irrational, prove √(1+a) is irrational.

math-help-me

Homework Statement



Suppose a is irrational, prove √(1+a) is irrational.


Homework Equations



A number is rational if it can be expressed as p/q, where p and q are integers and q ≠ 0.

The Attempt at a Solution



I can reason through it intuitively, but I'm not sure how to demonstrate it formally. Any help or advice would be greatly appreciated.
 
I think you can use a proof by contradiction here. Assume you can write sqrt(1 + a) as p/q with p and q integers and see what happens.
 
That was sort of my attempt at a solution. This is what I've got now.
Assume a is irrational, but √(1+a) is rational.
Then √(1+a) = p/q for integers p, q with q ≠ 0.
1 + a = p^2/q^2 → a = p^2/q^2 - 1 = (p^2 - q^2)/q^2
Since p and q are integers, p^2 - q^2 and q^2 must be integers.
Thus a must also be rational by definition, a contradiction.
Thus if a is rational, √(1+a) is rational. The contrapositive is equivalent.
Therefore if a is irrational, √(1+a) is irrational.
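The key algebraic step (squaring p/q and subtracting 1 always yields a ratio of integers) can be sanity-checked with exact rational arithmetic. A minimal sketch using Python's fractions module; the helper name a_from_root is just for illustration:

```python
from fractions import Fraction

def a_from_root(p, q):
    """If sqrt(1 + a) = p/q, then squaring gives 1 + a = p^2/q^2,
    so a = (p^2 - q^2)/q^2 -- a ratio of integers, hence rational."""
    return Fraction(p * p - q * q, q * q)

# Example: if sqrt(1 + a) = 3/2, then a = (9 - 4)/4 = 5/4, rational.
print(a_from_root(3, 2))  # 5/4
print(a_from_root(7, 5))  # 24/25
```

Of course this is only a sanity check of the algebra, not a proof; the proof is precisely that the output is a Fraction for *any* integer inputs.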

Is this all valid and a valid conclusion? Thanks again!
 
math-help-me said:
That was sort of my attempt at a solution. This is what I've got now.
Assume a is irrational, but √(1+a) is rational.
Then √(1+a) = p/q for integers p, q with q ≠ 0.
1 + a = p^2/q^2 → a = p^2/q^2 - 1 = (p^2 - q^2)/q^2
Since p and q are integers, p^2 - q^2 and q^2 must be integers.
Thus a must also be rational by definition, a contradiction.
Correct up to this point.

Thus if a is rational, √(1+a) is rational.
No, it's the other way around: if √(1+a) is rational, then a is rational. This is exactly what you just proved.

And therefore:
The contrapositive is equivalent.
Therefore, if a is irrational, √(1+a) is irrational.
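For reference, the corrected argument written out cleanly in contrapositive form:

```latex
\textbf{Claim.} If $a$ is irrational, then $\sqrt{1+a}$ is irrational.

\textbf{Proof.} We prove the contrapositive: if $\sqrt{1+a}$ is rational,
then $a$ is rational. Suppose $\sqrt{1+a} = p/q$ with $p, q$ integers and
$q \neq 0$. Squaring,
\[
1 + a = \frac{p^2}{q^2}
\quad\Longrightarrow\quad
a = \frac{p^2}{q^2} - 1 = \frac{p^2 - q^2}{q^2}.
\]
Since $p^2 - q^2$ and $q^2$ are integers with $q^2 \neq 0$, $a$ is
rational. Hence, if $a$ is irrational, $\sqrt{1+a}$ is irrational.
$\blacksquare$
```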
 
Suppose that a were irrational, but √(1+a) = r was rational.

What can you say, then, about r^2?
 