How do computers evaluate the GCD of two integers?

  • #1
matqkks
Do they use the Euclidean Algorithm?
 

Answers and Replies

  • #2
mathman
Science Advisor
Computers do not have minds of their own. A program to evaluate the GCD of two integers is written by a programmer, who uses whatever algorithm they judge best.
 
  • #3
Check the rosettacode.org site; someone may have posted GCD code in whatever language you are interested in.
 
  • #4
Here’s the Euclidean algorithm:

https://en.m.wikipedia.org/wiki/Euclidean_algorithm

I imagine math software would use this algorithm, since subtractions are relatively cheap compared to multiplications or divisions.

Another remote possibility is a combination of a lookup table and the Euclidean algorithm to speed things up, although that may take more memory than necessary and, depending on the table size, cost more time as well.
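For concreteness, here is a minimal sketch of the subtraction-only variant described above (Python, chosen here just for readability; it assumes both inputs are positive integers):

```python
# Subtraction-only Euclidean algorithm: gcd(a, b) is unchanged
# when the smaller number is subtracted from the larger one.
def gcd_subtract(a: int, b: int) -> int:
    while a != b:
        if a > b:
            a -= b
        else:
            b -= a
    return a

print(gcd_subtract(1071, 462))  # 21
```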
 
  • #5
Quoting post #4: "I imagine math software would use this algorithm, since subtractions are relatively cheap compared to multiplications or divisions."
You still need some sort of division to work out how many times to subtract. Repeatedly subtracting the smaller number from the larger one has a really bad worst case when the two numbers differ greatly in size: the number of steps grows exponentially in the input length, while the division-based version scales with the square of the input length.
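To make that contrast concrete, here is a sketch of the division-based loop together with a step counter for the subtraction-only variant (the function names are my own, not from any library):

```python
# Division-based Euclidean algorithm: replace (a, b) by (b, a mod b).
def gcd_mod(a: int, b: int) -> int:
    while b != 0:
        a, b = b, a % b
    return a

# Count how many iterations the subtraction-only variant needs.
def subtract_steps(a: int, b: int) -> int:
    steps = 0
    while a != b:
        if a > b:
            a -= b
        else:
            b -= a
        steps += 1
    return steps

print(gcd_mod(10**6, 1))         # 1, after a single modulo step
print(subtract_steps(10**6, 1))  # 999999 steps
```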

There are also algorithms that compute the GCD in subquadratic time for very long inputs, based on fast multiplication (e.g. half-GCD methods such as Schönhage's algorithm).
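Those subquadratic methods are fairly involved, but as a simpler alternative to division-based Euclid (my addition, not mentioned above), here is a sketch of Stein's binary GCD, which uses only shifts, parity tests, and subtraction; it is still quadratic in the bit length, but each step is cheap:

```python
# Stein's binary GCD: division-free, using shifts and subtraction.
def gcd_binary(a: int, b: int) -> int:
    if a == 0:
        return b
    if b == 0:
        return a
    # Factor out common powers of two: gcd(2a, 2b) = 2 * gcd(a, b).
    shift = 0
    while (a | b) & 1 == 0:
        a >>= 1
        b >>= 1
        shift += 1
    # Powers of two remaining in only one argument are not common.
    while a & 1 == 0:
        a >>= 1
    while b != 0:
        while b & 1 == 0:
            b >>= 1
        # Both odd now; gcd(a, b) = gcd(min, difference).
        if a > b:
            a, b = b, a
        b -= a
    return a << shift

print(gcd_binary(1071, 462))  # 21
```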
 
