Computers evaluate the greatest common divisor (gcd) of two integers primarily using Euclid's algorithm, an efficient method based on the principle that the gcd of two numbers also divides their difference; equivalently, gcd(a, b) = gcd(b, a mod b). The algorithm repeatedly replaces the pair (a, b) with (b, a mod b) until the second number reaches zero, at which point the first number is the gcd. This method is favored for its simplicity and efficiency, and it remains a foundational technique in computer science for programming tasks involving number theory and integer arithmetic.
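The iterative procedure described above can be sketched in Python as follows (the function name and the handling of negative inputs via `abs` are illustrative choices, not part of the original description):

```python
def gcd(a: int, b: int) -> int:
    """Compute the greatest common divisor using Euclid's algorithm."""
    a, b = abs(a), abs(b)  # gcd is conventionally defined on non-negative values
    while b != 0:
        # Replace the pair with (b, a mod b) until the remainder is zero.
        a, b = b, a % b
    return a

print(gcd(48, 18))  # → 6
```

Each iteration strictly reduces the second argument, so the loop always terminates; the number of iterations grows only logarithmically with the size of the inputs, which is why the method is so efficient in practice.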