How do computers evaluate the GCD of two integers?

In summary, several algorithms can compute the GCD of two integers, but the most commonly used is the Euclidean algorithm. It can be formulated with repeated subtraction, though practical implementations use division (the modulo operation) to avoid a poor worst case when the inputs are very different in size. For very long inputs there are alternative algorithms with better asymptotic running time.
  • #1
Do they use the Euclidean Algorithm?
 
  • #2
Computers do not have minds of their own. A program to evaluate the GCD of two integers is written by a programmer, who uses whatever algorithm they feel is best.
 
  • #3
Check the rosettacode.org site; someone may have written the code in whatever language you are interested in.
 
  • #4
Here’s the Euclidean algorithm:

https://en.m.wikipedia.org/wiki/Euclidean_algorithm

I imagine math software would use this algorithm, since subtractions are relatively cheap compared to multiplications or divisions.

Another remote possibility is a combination of a lookup table and the Euclidean algorithm to speed things up, although that may take more memory than necessary and, depending on the table size, be more costly time-wise.
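
A minimal sketch of the subtraction-only form of the Euclidean algorithm described above (illustrative only, not how any particular math package actually implements it; the function name is made up for this example):

def gcd_subtract(a, b):
    # assumes a and b are positive integers
    while a != b:
        if a > b:
            a = a - b   # replace the larger value by the difference
        else:
            b = b - a
    return a            # when both values are equal, that value is the GCD

# Example: gcd_subtract(48, 18) returns 6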
 
  • #5
jedishrfu said:
I imagine math software would use this algorithm, since subtractions are relatively cheap compared to multiplications or divisions.
You still need some sort of division to calculate how much you have to subtract. Simply subtracting the smaller number from the larger number repeatedly has a really bad worst-case runtime if the numbers are very different in size (exponential in the number of digits, whereas using division makes the method scale roughly with the square of the input length).

There are alternative algorithms that can compute the GCD faster than quadratic runtime for longer inputs.
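
For comparison, a sketch of the division-based (modulo) form, which handles very unbalanced inputs in far fewer steps; again the function name is just for illustration:

def gcd_mod(a, b):
    # assumes non-negative integers, not both zero
    while b != 0:
        a, b = b, a % b   # the remainder replaces many repeated subtractions
    return a

# Example: gcd_mod(10**9, 3) returns 1 after a couple of iterations,
# whereas repeated subtraction would need hundreds of millions of steps.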
 