Square root in Q(√2) means it's in Z[√2]

SUMMARY

The discussion establishes that if \( a + b\sqrt{2} \), with \( a, b \in \mathbb{Z} \), has a square root in \( \mathbb{Q}(\sqrt{2}) \), then that square root already lies in \( \mathbb{Z}[\sqrt{2}] \). The proof writes the square root as \( \frac{p + q\sqrt{2}}{r} \) with integers \( p, q, r \) having no common factor greater than 1 and shows that \( r = 1 \): any prime factor of \( r \) would have to divide both \( p \) and \( q \) as well, a contradiction.

PREREQUISITES
  • Understanding of algebraic structures, specifically \( \mathbb{Q}(\sqrt{2}) \) and \( \mathbb{Z}[\sqrt{2}] \)
  • Familiarity with rational and irrational numbers
  • Knowledge of basic number theory, including prime factorization
  • Ability to manipulate equations involving square roots and rational expressions
NEXT STEPS
  • Study the properties of algebraic integers in number fields
  • Learn about the structure of quadratic fields, particularly \( \mathbb{Q}(\sqrt{d}) \)
  • Explore the concept of unique factorization in rings of integers
  • Investigate the implications of the irrationality of square roots in algebraic equations
USEFUL FOR

Mathematicians, number theorists, and students studying algebraic number theory who are interested in the properties of square roots in quadratic fields.

caffeinemachine
Let $a,b \in \mathbb{Z}$. Show that if $a+b\sqrt{2}$ has a square root in $\mathbb{Q}(\sqrt{2})$, then the square root is actually in $\mathbb{Z}[\sqrt{2}]$.

Only one approach comes to mind. Let $r_1, r_2 \in \mathbb{Q}$ be such that $a+b\sqrt{2}=(r_1+r_2\sqrt{2})^2$. This gives $a=r_1^2+2r_2^2$ and $b=2r_1r_2$. I need to somehow show that $r_1, r_2$ are integers. I played with these equations, putting $r_i=p_i/q_i$ with $\gcd(p_i,q_i)=1$, but I couldn't conclude anything.
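Not a proof, but the claim is easy to test on examples: since $a=r_1^2+2r_2^2$ bounds any solution, a finite search over $\mathbb{Z}[\sqrt{2}]$ should find the square root whenever the theorem promises one. A minimal Python sketch (the function name and search bound are my own choices):

```python
from math import isqrt

def sqrt_in_Z_sqrt2(a: int, b: int):
    """Return integers (x, y) with (x + y*sqrt(2))**2 == a + b*sqrt(2), else None.

    Squaring out gives the system from the post above:
        a = x**2 + 2*y**2,   b = 2*x*y,
    so any solution satisfies |x|, |y| <= sqrt(a), and a finite search
    suffices.  (For a < 0 the first equation has no solution at all.)
    """
    if a < 0:
        return None
    bound = isqrt(a)
    for x in range(-bound, bound + 1):
        for y in range(-bound, bound + 1):
            if x * x + 2 * y * y == a and 2 * x * y == b:
                return (x, y)
    return None

print(sqrt_in_Z_sqrt2(3, 2))    # (-1, -1): -(1 + sqrt(2)) also squares to 3 + 2*sqrt(2)
print(sqrt_in_Z_sqrt2(17, 12))  # (-3, -2), i.e. 17 + 12*sqrt(2) = (3 + 2*sqrt(2))**2
print(sqrt_in_Z_sqrt2(5, 1))    # None: 2*x*y = 1 has no integer solution
```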
 
Opalg
If $a+b\sqrt{2}$ has a square root in $\mathbb{Q}(\sqrt{2})$, then the square root can be written in the form $\dfrac{p+q\sqrt2}r$, where $p$, $q$ and $r$ are integers and $r$ is chosen to be positive and as small as possible (so that in particular the triple $p,\,q,\,r$ will have no common factor greater than 1).
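For concreteness, putting $r_1 + r_2\sqrt2$ over a single positive denominator so that the triple has no common factor can be done mechanically; a small Python sketch of that normalisation (the name `lowest_terms` is mine; Python ≥ 3.9 for `math.lcm` and three-argument `gcd`):

```python
from fractions import Fraction
from math import gcd, lcm

def lowest_terms(r1: Fraction, r2: Fraction):
    """Rewrite r1 + r2*sqrt(2) as (p + q*sqrt(2)) / r with r > 0 and gcd(p, q, r) == 1."""
    r = lcm(r1.denominator, r2.denominator)   # least common denominator, always > 0
    p = r1.numerator * (r // r1.denominator)
    q = r2.numerator * (r // r2.denominator)
    g = gcd(p, q, r)                          # strip any residual common factor
    return p // g, q // g, r // g

print(lowest_terms(Fraction(3, 4), Fraction(5, 6)))  # (9, 10, 12)
```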

Then $r^2(a+b\sqrt2) = (p+q\sqrt2)^2 = p^2+2q^2 + 2pq\sqrt2$, and therefore $p^2+2q^2 - r^2a = (r^2b-2pq)\sqrt2.$ But $\sqrt2$ is irrational, so no nonzero multiple of it can be an integer. Therefore $$p^2+2q^2 = r^2a, \qquad 2pq = r^2b.$$
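That coefficient comparison can be double-checked symbolically, e.g. with sympy (assuming it is installed; the variable names are mine):

```python
import sympy as sp

p, q, r, a, b = sp.symbols('p q r a b')
s2 = sp.sqrt(2)

# r^2 (a + b*sqrt2) - (p + q*sqrt2)^2, expanded over {1, sqrt(2)}:
diff = sp.expand(r**2 * (a + b*s2) - (p + q*s2)**2)

# Both components must vanish, giving exactly the two displayed equations.
print(diff.coeff(s2, 0))  # a*r**2 - p**2 - 2*q**2
print(diff.coeff(s2, 1))  # b*r**2 - 2*p*q
```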
Suppose that $r$ has an odd prime factor $\rho$. The second of those displayed equations shows that $\rho$ divides $2pq$, and since $\rho$ is odd it must be a factor of either $p$ or $q$. The first of the displayed equations then shows that $\rho$ is a factor of both: since $\rho \mid r$, $\rho$ divides $p^2+2q^2$; if $\rho \mid p$ this forces $\rho \mid 2q^2$ and hence $\rho \mid q$ ($\rho$ being odd), while if $\rho \mid q$ it forces $\rho \mid p^2$ and hence $\rho \mid p$. Thus $p$, $q$ and $r$ have the common factor $\rho$, contrary to the initial assumption.

Next, suppose that $r$ is even, say $r=2s$. Then the first displayed equation becomes $p^2+2q^2 = 4s^2a$, showing that $p^2$, and hence $p$, must be even, say $p=2t.$ It follows that $2t^2+q^2 = 2s^2a$, showing that $q^2$, and hence $q$, is even. Thus $p$, $q$ and $r$ have the common factor 2, again contrary to the initial assumption.

The conclusion is that $r$ has no prime factors at all and is therefore equal to 1, proving that $a+b\sqrt{2}$ has a square root in $\mathbb{Z}[\sqrt{2}].$
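A concrete instance of the result: $17+12\sqrt2=(3+2\sqrt2)^2$, so its square root has integer coefficients, exactly as the theorem requires. sympy's `sqrtdenest` recovers this directly (assuming sympy is available):

```python
import sympy as sp
from sympy import sqrtdenest

s2 = sp.sqrt(2)

# (3 + 2*sqrt(2))**2 = 9 + 12*sqrt(2) + 8 = 17 + 12*sqrt(2)
assert sp.expand((3 + 2*s2)**2) == sp.expand(17 + 12*s2)

# Denesting sqrt(17 + 12*sqrt(2)) lands in Z[sqrt(2)], as the theorem predicts.
print(sqrtdenest(sp.sqrt(17 + 12*s2)))  # 2*sqrt(2) + 3
```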
 
caffeinemachine
Thank You so much!
 
