The complexity of Euclid's greatest common divisor algorithm

Posted 2019-07-27 06:59

Question:

Consider this implementation of Euclid's algorithm:

def gcd(a, b):
    # Repeatedly replace (a, b) with (b, a mod b); when b reaches 0, a holds the gcd.
    while b != 0:
        a, b = b, a % b
    return a

A nice proof on Wikipedia shows that the algorithm never needs more than O(h) divisions, where h is the number of digits of the smaller number b.

However, on a Turing machine, a procedure that computes a mod b takes O(a + b) time. My intuition and some fairly large tests suggest that the complexity of Euclid's algorithm is still O(a + b) on a Turing machine.
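One way to get a feel for both claims is a quick experiment (a rough sketch, not a proof; gcd_with_stats and the cost model are mine, chosen only for illustration). It counts the divisions and charges each mod a cost of a + b as a crude stand-in for the Turing-machine cost above:

import random

def gcd_with_stats(a, b):
    # Euclid's algorithm, counting divisions and a crude per-step cost of a + b.
    divisions, cost = 0, 0
    while b != 0:
        divisions += 1
        cost += a + b          # stand-in for the assumed O(a + b) cost of one mod on a tape
        a, b = b, a % b
    return a, divisions, cost

random.seed(0)
for _ in range(5):
    a = random.randrange(10**6, 10**12)
    b = random.randrange(10**6, 10**12)
    g, divisions, cost = gcd_with_stats(a, b)
    digits = len(str(min(a, b)))
    print(f"gcd={g:>12}  divisions={divisions:2d}  digits(min)={digits:2d}  cost/(a+b)={cost / (a + b):.2f}")

On inputs like these, the division count stays within a small multiple of the digit count of the smaller number, and the accumulated cost stays within a small constant multiple of a + b, because the operands shrink geometrically from one iteration to the next.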

Any thoughts on how to prove that?

Answer 1:

If I were implementing GCD on binary numbers on a Turing machine, I'd implement the following algorithm. I can't see how it would be faster than O((ln a + ln b)^2), though. I think the most efficient representation would be to interleave the bits of the two values after step 2.

  1. Let s1 = the number of zeros in the least significant bits of a. Delete these bottom s1 zero bits.
  2. Let s2 = the number of zeros in the least significant bits of b. Delete these bottom s2 zero bits.
  3. Let s = min(s1,s2)
  4. Now a and b are both odd. If b < a then swap a and b.
  5. Now b >= a. Set b = b - a; since both were odd, b is now even, so delete all the least significant zero bits from b.
  6. If b != 0, goto 4.
  7. Append s zero bits to a (that is, multiply a by 2^s). This is the result.
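Steps 1-7 are essentially the binary GCD (Stein's) algorithm. Below is a minimal Python sketch of them, working on plain integers rather than on a tape, so it only checks the arithmetic and says nothing about the Turing-machine cost or the interleaved representation; the name binary_gcd and the test pairs are mine:

import math

def binary_gcd(a, b):
    # Binary GCD following steps 1-7 above; assumes a, b > 0.
    s1 = (a & -a).bit_length() - 1      # steps 1-3: count and strip trailing zero bits,
    s2 = (b & -b).bit_length() - 1      # remembering the power of two shared by a and b
    a >>= s1
    b >>= s2
    s = min(s1, s2)
    while b != 0:                       # steps 4-6
        if b < a:
            a, b = b, a                 # step 4: ensure b >= a
        b -= a                          # step 5: odd minus odd is even...
        if b:
            b >>= (b & -b).bit_length() - 1   # ...so stripping zeros removes at least one bit
    return a << s                       # step 7: put the shared factors of two back

for x, y in [(12, 18), (1071, 462), (2**20, 48), (17, 5)]:
    assert binary_gcd(x, y) == math.gcd(x, y)
print("binary_gcd agrees with math.gcd on the sample pairs")

Each pass through the loop at least halves the product a*b (the difference of two odd numbers is even), so the number of passes is O(ln a + ln b); if each subtraction and shift costs O(ln a + ln b) tape operations, that lines up with the O((ln a + ln b)^2) estimate above.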