# How do computers evaluate the GCD of two integers?

In summary, various algorithms can compute the GCD of two integers, but the most commonly used is the Euclidean algorithm. In its simplest form it relies on repeated subtraction, which is cheap per step, although practical implementations use division with remainder instead; for very long inputs there are alternative algorithms with better asymptotic runtime.
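In practice, many languages already ship a GCD routine in their standard library, so a programmer rarely writes one by hand. Python's `math.gcd` is one example (CPython implements it with a Euclidean-style routine internally):

```python
import math

# Standard-library GCD; no hand-written algorithm needed.
print(math.gcd(48, 18))  # -> 6
```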

#### matqkks

Do they use the Euclidean Algorithm?

Computers do not have minds of their own. A program to evaluate the GCD of two integers is written by a programmer, who will use whatever algorithm they consider best.

#### berkeman
Check the rosettacode.org site; someone may have written GCD code in whatever language you are interested in.

#### jedishrfu

Here’s the Euclidean algorithm:

https://en.m.wikipedia.org/wiki/Euclidean_algorithm

I imagine math software would use this algorithm to compute it, as subtractions are relatively cheap compared to multiplications or divisions.
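The subtraction-only form of the algorithm can be sketched in a few lines (a minimal illustration, assuming both inputs are positive):

```python
def gcd_subtraction(a, b):
    """Subtraction-only Euclidean algorithm; assumes a, b > 0."""
    while a != b:
        if a > b:
            a -= b  # replace the larger number by the difference
        else:
            b -= a
    return a

print(gcd_subtraction(48, 18))  # -> 6
```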

Another, more remote, possibility is a combination of a lookup table and the Euclidean algorithm to speed things up, although that may take more memory than necessary and, depending on the table size, could be more costly time-wise.
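One way such a hybrid might look (the table size and layout here are purely illustrative assumptions): reduce the operands with ordinary Euclidean steps until both fit in a precomputed table, then finish with a single lookup.

```python
from math import gcd  # used only to precompute the table

TABLE_SIZE = 64  # hypothetical cutoff; a real implementation would tune this
GCD_TABLE = [[gcd(a, b) for b in range(TABLE_SIZE)]
             for a in range(TABLE_SIZE)]

def gcd_with_table(a, b):
    # Shrink the operands with remainder steps until both fit the table.
    while a >= TABLE_SIZE or b >= TABLE_SIZE:
        if a < b:
            a, b = b, a
        if b == 0:
            return a
        a, b = b, a % b
    return GCD_TABLE[a][b]
```

As the post notes, the table costs memory (here 64×64 entries) and only pays off if lookups genuinely beat the remaining division steps.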

> jedishrfu said: I imagine math software would use this algorithm to compute it, as subtractions are relatively cheap compared to multiplications or divisions.

You still need some sort of division to calculate how much you have to subtract. Simply subtracting the smaller number from the larger one repeatedly has a really bad worst-case runtime if the numbers are very different: exponential in the input length, whereas using division makes the method scale with the square of the input length.
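The division-based version described here can be sketched as follows; each remainder operation replaces what could be millions of individual subtractions:

```python
def gcd_division(a, b):
    """Euclidean algorithm with the remainder step; assumes a, b >= 0."""
    while b != 0:
        a, b = b, a % b  # one division replaces many repeated subtractions
    return a

# gcd(1_000_000_000, 3) takes only a few remainder steps here, versus
# hundreds of millions of steps for pure repeated subtraction.
print(gcd_division(1_000_000_000, 3))  # -> 1
```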

There are alternative algorithms that can compute the GCD faster than quadratic runtime for longer inputs.
