I'm trying to do the homework for a course I found online. A problem on the first homework set goes as follows: suppose A is an integral domain that is integrally closed in its fraction field K, and suppose q in A is not a square, so that L = K(sqrt(q)) is a quadratic extension of K. Describe the conditions on r, s in K that are necessary and sufficient for r + s*sqrt(q) to be integral over A in L.

I have absolutely no clue how to approach this, since A is not even assumed to be a UFD. The proof for A = Z uses the fact that Z is a UFD, so that the minimal polynomial over the fraction field equals the minimal polynomial over A for every integral element (Gauss's lemma). Does anyone have any ideas on how to approach this? Thanks.
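For reference, here is the direct computation of the polynomial that r + s*sqrt(q) satisfies over K (assuming s is nonzero, so the element is not already in K):

```latex
\[
\alpha = r + s\sqrt{q}
\;\implies\;
(\alpha - r)^2 = s^2 q
\;\implies\;
\alpha^2 - 2r\alpha + (r^2 - s^2 q) = 0,
\]
so the minimal polynomial of $\alpha$ over $K$ is
\[
x^2 - 2rx + (r^2 - s^2 q),
\]
whose coefficients are the trace $2r$ and the norm $r^2 - s^2 q$ of $\alpha$.
```

So any necessary and sufficient condition on r and s presumably has to be expressible in terms of these two quantities.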