How do computers evaluate the GCD of two integers?

  • Thread starter matqkks
  • #1
Main Question or Discussion Point

Do they use the Euclidean Algorithm?
 

Answers and Replies

  • #2
mathman
Science Advisor
Computers do not have minds of their own. A program to evaluate the GCD of two integers would be written by a programmer, who would use whatever algorithm he or she feels is best.
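In practice, many languages already ship a GCD routine, so a programmer may simply call the library. A minimal illustration in Python (assuming Python 3.5+, where math.gcd was added to the standard library; which algorithm runs underneath is an implementation detail of the interpreter):

```
import math

# Classic worked example: gcd(462, 1071) = 21.
print(math.gcd(462, 1071))  # prints 21
```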
 
  • #3
Check the rosettacode.org site; someone may have developed the code in whatever language you are interested in.
 
  • #4
Here’s the Euclidean algorithm:

https://en.m.wikipedia.org/wiki/Euclidean_algorithm

I imagine math software would use this algorithm to compute it, as subtractions are relatively cheap compared to multiplications or divisions.

Another remote possibility is that it uses a combination of a lookup table and the Euclidean algorithm to speed things up, although that may take up more memory than necessary and, depending on the table size, be more costly time-wise.
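For concreteness, here is a minimal sketch of the subtraction-only form described above (the function name is my own; this is the textbook algorithm, not any particular package's code):

```
def gcd_subtraction(a: int, b: int) -> int:
    # Repeatedly subtract the smaller number from the larger;
    # the invariant is that gcd(a, b) never changes.
    a, b = abs(a), abs(b)
    while a != 0 and b != 0:
        if a > b:
            a -= b
        else:
            b -= a
    return a or b  # the nonzero survivor is the GCD (gcd(0, 0) = 0)
```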
 
  • #5
Quoting #4: "I imagine math software would use this algorithm to compute it, as subtractions are relatively cheap compared to multiplications or divisions."
You still need some sort of division to calculate how much you have to subtract. Simply subtracting the smaller number from the larger one repeatedly has a really bad worst-case runtime when the numbers are very different: the step count grows exponentially with the input length, whereas using division makes the method scale with the square of the input length.
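For comparison, a sketch of the remainder (division) form being described here, in the same style as the earlier snippet:

```
def gcd_remainder(a: int, b: int) -> int:
    # One remainder operation replaces many repeated subtractions:
    # gcd(10**9, 3) finishes in a couple of steps here, while the
    # subtraction-only version would loop hundreds of millions of times.
    a, b = abs(a), abs(b)
    while b != 0:
        a, b = b, a % b
    return a
```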

There are alternative algorithms that can compute the GCD in better-than-quadratic time for longer inputs.
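One widely used alternative worth naming is the binary (Stein's) GCD, which replaces division with shifts and subtraction; note it is still quadratic for very long inputs, so it is not itself one of the subquadratic methods (e.g. Lehmer-style half-GCD) alluded to above. A sketch:

```
def gcd_binary(a: int, b: int) -> int:
    # Stein's algorithm: only parity tests, halving, and subtraction.
    a, b = abs(a), abs(b)
    if a == 0:
        return b
    if b == 0:
        return a
    shift = 0
    while a % 2 == 0 and b % 2 == 0:  # gcd(2a, 2b) = 2 * gcd(a, b)
        a //= 2
        b //= 2
        shift += 1
    while a % 2 == 0:                 # 2 no longer divides the GCD
        a //= 2
    while b != 0:
        while b % 2 == 0:
            b //= 2
        if a > b:
            a, b = b, a               # keep a <= b
        b -= a                        # odd - odd is even, so b shrinks fast
    return a << shift                 # restore the common factors of two
```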
 
