# GCD proof in Number Theory

Here is my problem:

Prove or disprove: if gcd(m, n) = d, then gcd(a, mn) = gcd(a, m) * gcd(a, n)/d.

I can't seem to get it started; I sort of begin, but it just doesn't go anywhere. I know by definition that d | m and d | n, so there are integers x and y with m = xd and n = yd. I can then let gcd(a, mn) = E, so E | a and E | mn, and do the same as I did before. But the last part, gcd(a, m) * gcd(a, n)/d, is giving me a really rough time; I don't know how to tie it all in. I have a lot of integers and letters, and I think I am very lost at this point. Please help!
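Since the problem says "prove or disprove", it can help to probe small values before committing to a proof. The following brute-force sketch (my own, not from the thread; the function name is arbitrary) searches for a counterexample to the claimed identity:

```python
from math import gcd

def find_counterexample(limit=20):
    """Search small (a, m, n) for a case where the claimed identity
    gcd(a, mn) = gcd(a, m) * gcd(a, n) / d, with d = gcd(m, n), fails.
    Comparing cross-multiplied avoids non-integer division."""
    for a in range(1, limit):
        for m in range(1, limit):
            for n in range(1, limit):
                d = gcd(m, n)
                if gcd(a, m * n) * d != gcd(a, m) * gcd(a, n):
                    return (a, m, n)
    return None

print(find_counterexample())  # → (1, 2, 2)
```

For (a, m, n) = (1, 2, 2) we get d = 2, gcd(1, 4) = 1, but gcd(1, 2) * gcd(1, 2)/2 = 1/2, so the two sides disagree; the search confirms the statement as written is false.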

melmath


AKG
Homework Helper
Do you know the fundamental theorem of arithmetic, which says that every positive integer can be written uniquely as a product of primes? So we can write:

$$m = \prod _{i=1}^{\infty}p_i^{\mu _i},\ n = \prod _{i=1}^{\infty}p_i^{\nu _i}$$

where $p_i$ is the $i$th prime, and the exponents $\mu _i,\, \nu _i$ are eventually zero since $m$ and $n$ are finite. Using the notation above, how would you express gcd(m, n)?
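As a concrete illustration of working with that factorization (my own sketch, not part of the thread; function names are arbitrary), one can compute a gcd directly from prime exponents and compare against the built-in:

```python
from math import prod

def prime_exponents(x):
    """Trial-division factorization: return {prime: exponent} for x >= 1."""
    factors = {}
    p = 2
    while p * p <= x:
        while x % p == 0:
            factors[p] = factors.get(p, 0) + 1
            x //= p
        p += 1
    if x > 1:
        factors[x] = factors.get(x, 0) + 1
    return factors

def gcd_by_exponents(m, n):
    """gcd computed prime-by-prime: for each prime, take the smaller
    of its exponents in m and in n, then multiply back together."""
    fm, fn = prime_exponents(m), prime_exponents(n)
    return prod(p ** min(fm.get(p, 0), fn.get(p, 0))
                for p in set(fm) | set(fn))
```

For example, gcd_by_exponents(12, 18) pairs 12 = 2^2 * 3 with 18 = 2 * 3^2 and returns 2 * 3 = 6, matching math.gcd(12, 18).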

matt grime