How do computers evaluate the GCD of two integers?

  • #1
matqkks
Do they use the Euclidean Algorithm?
 

Answers and Replies

  • #2
mathman
Science Advisor
Computers do not have minds of their own. A program to evaluate the GCD of two integers would be written by a programmer, who would use whatever algorithm he or she feels is best.
 
  • #3
Check the rosettacode.org site; someone may have posted GCD code in whatever language you are interested in.
 
  • #4
Here’s the Euclidean algorithm:

https://en.m.wikipedia.org/wiki/Euclidean_algorithm

I imagine math software would use this algorithm to compute it, since subtractions are relatively cheap compared to multiplications or divisions.

Another, more remote possibility is a combination of a lookup table and the Euclidean algorithm to speed things up, although that may use more memory than necessary and, depending on the table size, could even be more costly time-wise.
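
To make the subtraction idea concrete, here is a minimal sketch of the subtraction-based form of the Euclidean algorithm in Python (the function name is my own, for illustration):

Code:
def gcd_subtract(a, b):
    # Subtraction-based Euclidean algorithm.
    # Invariant: gcd(a, b) == gcd(a - b, b), so repeatedly replacing
    # the larger value by the difference preserves the GCD.
    a, b = abs(a), abs(b)
    while a != 0 and b != 0:
        if a > b:
            a -= b
        else:
            b -= a
    return a or b  # whichever is nonzero; gcd(0, 0) == 0 by convention

print(gcd_subtract(1071, 462))  # prints 21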
 
  • #5
"I imagine math software would use this algorithm to compute it, since subtractions are relatively cheap compared to multiplications or divisions."
You still need some sort of division to calculate how much you have to subtract. Simply subtracting the smaller number from the larger number repeatedly has a really bad worst-case runtime if the numbers are very different (exponential in the input length, while using division makes the method scale with roughly the square of the input length).
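
For comparison, here is a minimal sketch of the division-based form, which collapses each run of repeated subtractions into a single remainder operation (again in Python, with an illustrative name):

Code:
def gcd_mod(a, b):
    # Division-based Euclidean algorithm.
    # Each step replaces (a, b) with (b, a % b); the number of steps
    # grows with the number of digits, not the magnitude of the inputs.
    a, b = abs(a), abs(b)
    while b:
        a, b = b, a % b
    return a

print(gcd_mod(10**9, 3))  # prints 1 after a couple of steps;
                          # pure subtraction would need ~333 million steps

In practice you would just call the standard library, e.g. Python's math.gcd, rather than rolling your own.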

There are alternative algorithms that can compute the GCD in better-than-quadratic time for longer inputs.
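
One well-known alternative in the division-free direction is the binary GCD (Stein's algorithm), which uses only shifts and subtraction. Note that it is still quadratic in the worst case for very long inputs, so it is not one of the sub-quadratic methods referred to above, but it illustrates that the textbook division step is not the only option. A minimal Python sketch:

Code:
def gcd_binary(a, b):
    # Binary GCD (Stein's algorithm), using only shifts and subtraction.
    # Identities used:
    #   gcd(2a, 2b) = 2 * gcd(a, b)
    #   gcd(2a, b)  = gcd(a, b)      when b is odd
    #   gcd(a, b)   = gcd(a - b, b)  when a, b are odd and a >= b
    a, b = abs(a), abs(b)
    if a == 0:
        return b
    if b == 0:
        return a
    # Factor out the power of two common to both operands.
    shift = ((a | b) & -(a | b)).bit_length() - 1
    a >>= shift
    b >>= shift
    a >>= (a & -a).bit_length() - 1      # make a odd
    while b:
        b >>= (b & -b).bit_length() - 1  # make b odd
        if a > b:
            a, b = b, a                  # keep a <= b
        b -= a                           # odd - odd is even (or zero)
    return a << shift

print(gcd_binary(1071, 462))  # prints 21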
 
